How young people join violent extremist groups -- and how to stop them | Erin Marie Saltman

169,206 views ・ 2017-09-18

TED



Translator: Lilian Chiu / Reviewer: nr chan
00:12 So in 2011, I altered my name so that I could participate in a Far Right youth camp in Hungary. I was doing a PhD looking at youth political socialization -- why young people were developing political ideologies in a post-communist setting -- and I saw that a lot of young people I was talking to were joining the Far Right, and this was astounding to me. So I wanted to enroll in this youth camp to get a better understanding of why people were joining.

00:42 So a colleague enrolled me, and my last name sounds a little bit too Jewish. So Erin got turned into Iréna, and Saltman got turned into Sós, which means "salty" in Hungarian. And in Hungarian, your last name goes first, so my James Bond name turned into "Salty Irena," which is not something I would have naturally chosen for myself.
01:06 But going to this camp, I was further shocked to realize that it was actually really fun. They talked very little about politics. It was mostly learning how to ride horses, shooting a bow and arrow, live music at night, free food and alcohol, also some air-gun target practice using mainstream politicians' faces as targets. And this seemed like a very friendly, inclusive group, actually, until you started talking or mentioning anything to do with the Roma population, Jewish people or immigrants, and then the discourse would become very hate-based very quickly.

01:46 So it led me into my work now, where we pose the question, "Why do people join violent extremist movements, and how do we effectively counter these processes?"
01:58 In the aftermath of horrible atrocities and attacks in places like Belgium and France, but also all over the world, sometimes it's easier for us to think, "Well, these must be sociopaths, these must be naturally violent individuals. They must have something wrong with their upbringing." And what's really tragic is that oftentimes there's no one profile. Many people come from educated backgrounds, different socioeconomic backgrounds, men and women, different ages, some with families, some single.

02:29 So why? What is this allure? And this is what I want to talk you through, as well as how do we challenge this in a modern era?
02:38 We do know, through research, that there are quite a number of different things that affect somebody's process of radicalization, and we categorize these into push and pull factors. And these are pretty much similar for Far Right, neo-Nazi groups all the way to Islamist extremist and terrorist groups.

02:55 And push factors are basically what makes you vulnerable to a process of radicalization, to joining a violent extremist group. And these can be a lot of different things, but roughly: a sense of alienation, a sense of isolation, questioning your own identity, but also feeling that your in-group is under attack -- and your in-group might be based on a nationality or an ethnicity or a religion -- and feeling that larger powers around you are doing nothing to help.

03:23 Now, push factors alone do not make you a violent extremist, because if that were the case, those same factors would apply to a group like the Roma population, and they're not a violently mobilized group. So we have to look at the pull factors. What are these violent extremist organizations offering that other groups are not offering? And actually, these are usually very positive things, very seemingly empowering things, such as brotherhood and sisterhood and a sense of belonging, as well as giving somebody a spiritual purpose, a divine purpose to build a utopian society if their goals can be met, but also a sense of empowerment and adventure.
04:02 When we look at foreign terrorist fighters, we see young men with the wind in their hair out in the desert and women going to join them to have nuptials out in the sunset. It's very romantic, and you become a hero. For both men and women, that's the propaganda being given.

04:19 So what extremist groups are very good at is taking a very complicated, confusing, nuanced world and simplifying that world into black and white, good and evil. And you become what is good, challenging what is evil.
04:36 So I want to talk a little bit about ISIS, Daesh, because they have been a game changer in how we look at these processes, through a lot of their material and their tactics. They're very much a modern movement. One of the aspects is the internet and the usage of social media, as we've all seen in headlines, tweeting and videos of beheadings. But the internet alone does not radicalize you. The internet is a tool. You don't go online shopping for shoes and accidentally become a jihadist. However, what the internet does do is act as a catalyst. It provides tools and scale and rapidity that don't exist elsewhere.

05:16 And with ISIS, all of a sudden, this idea of a cloaked, dark figure of a jihadist changed for us. All of a sudden, we were in their kitchens. We saw what they were eating for dinner. They were tweeting. We had foreign terrorist fighters tweeting in their own languages. We had women going out there talking about their wedding day, about the births of their children. We had gaming culture, all of a sudden, and references to Grand Theft Auto being made. So all of a sudden, they were homey. They became human.

05:46 And the problem is that, trying to counter it, lots of governments and social media companies just tried to censor. How do we get rid of terrorist content? And it became a cat-and-mouse game where we would see accounts taken down and they'd just come back up, and an arrogance around somebody having a 25th account, and material that was disseminated everywhere. But we also saw a dangerous trend -- violent extremists know the rules and regulations of social media, too. So we would see a banal conversation with a recruiter start on a mainstream platform, and at the point at which that conversation was going to become illegal, they would jump to a smaller, less regulated, more encrypted platform. So all of a sudden, we couldn't track where that conversation went. So this is a problem with censorship, which is why we need to develop alternatives to censorship.
06:35 ISIS is also a game-changer because it's state-building. It's not just recruiting combatants; it's trying to build a state. And what that means is, all of a sudden, your recruitment model is much broader. You're not just trying to get fighters -- now you need architects, engineers, accountants, hackers and women. We've actually seen a huge increase of women going over in the last 24, but especially 12, months. In some countries, one in four of the people going over to join are now women. And so, this really changes who we're trying to counter this process with.
07:08 Now, it's not all doom and gloom. So for the rest, I'd like to talk about some of the positive things and the new innovation in trying to prevent and counter violent extremism. Preventing is very different than countering, and actually, you can think of it in medical terms. Preventative medicine asks: How do we make it so you are naturally resilient to this process of radicalization? Whereas that is going to be different if somebody is already showing a symptom or a sign of belonging to a violent extremist ideology.

07:37 And so in preventative measures, we're talking more about really broad groups of people and exposure to ideas to make them resilient. Whereas it's very different if somebody is starting to question and agree with certain things online, and it's also very different if somebody already has a swastika tattoo and is very much embedded within a group. How do you reach them?

07:58 So I'd like to go through three examples, one for each of those levels, and talk you through what some of the new ways of engaging with people are becoming.
08:07 One is "Extreme Dialogue," and it's an educational program that we helped develop. This one is from Canada, and it's meant to create dialogues within a classroom setting, using storytelling, because violent extremism can be very hard to try to explain, especially to younger individuals. So we have a network of former extremists and survivors of extremism who tell their stories through video and create questions for classrooms, to start a conversation about the topic. These two examples show Christianne, who lost her son, who radicalized and died fighting for ISIS, and Daniel, a former neo-Nazi who was extremely violent, and they pose questions about their lives and where they're at and regret, and force a classroom to have a dialogue around it.
08:52 Now, looking at that middle range of individuals, actually, we need a lot of civil society voices. How do you interact with people who are looking for information online, who are starting to toy with an ideology, who are asking those searching identity questions? How do we provide alternatives for that? And that's when we combine large groups of civil society voices with creatives, techies, app developers, artists, comedians, and we can create really specified content and actually, online, disseminate it to very strategic audiences. So one example would be creating a satirical video which makes fun of Islamophobia, and targeting it to 15- to 20-year-olds online who have an interest in white power music and live specifically in Manchester. We can use these marketing tools to be very specific, so that we know, when somebody's viewing, watching and engaging with that content, it's not just the average person, it's not me or you -- it's a very specific audience that we are looking to engage with.
09:52 Even more downstream, we developed a pilot program called "One to One," where we took former extremists and we had them reach out directly to a group of labeled neofascists as well as Islamist extremists, and put direct messages through Facebook Messenger into their inbox, saying, "Hey, I see where you're going. I've been there. If you want to talk, I'm here." Now, we kind of expected death threats from this sort of interaction. It's a little alarming to have a former neo-Nazi say, "Hey, how are you?" But actually, we found that around 60 percent of the people reached out to responded, and of that, around another 60 percent had sustained engagement, meaning that they were having conversations with the hardest people to reach about what they were going through, planting seeds of doubt and giving them alternatives for talking about these subjects, and that's really important.
10:40 So what we're trying to do is actually bring unlikely sectors to the table. We have amazing activists all over the world, but oftentimes, their messages are not strategic or they don't actually reach the audiences they want to reach. So we work with networks of former extremists. We work with networks of young people in different parts of the world. And we work with them to bring the tech sector to the table with artists and creatives and marketing expertise, so that we can actually mount a more robust challenge to extremism that works together.

11:11 So I would say that if you are in the audience and you happen to be a graphic designer, a poet, a marketing expert, somebody who works in PR, a comedian -- you might not think that this is your sector, but actually, the skills that you have right now might be exactly what is needed to help challenge extremism effectively. Thank you.

11:33 (Applause)