How technology can fight extremism and online harassment | Yasmin Green
2018-06-27
00:13
My relationship with the internet reminds me of the setup to a clichéd horror movie. You know, the blissfully happy family moves in to their perfect new home, excited about their perfect future, and it's sunny outside and the birds are chirping ... And then it gets dark. And there are noises from the attic. And we realize that that perfect new house isn't so perfect.
00:40
When I started working at Google in 2006,
8
40485
3131
2006 年,當我開始
在谷歌(Google)工作時,
00:43
Facebook was just a two-year-old,
9
43640
1767
臉書(Facebook)才剛推出兩年,
00:45
and Twitter hadn't yet been born.
10
45431
2012
推特(Twitter)甚至還沒問世。
00:47
And I was in absolute awe
of the internet and all of its promise
11
47848
4410
我對網際網路及它所有的承諾
感到絕對的敬畏,
00:52
to make us closer
12
52282
1437
它承諾要讓我們
00:53
and smarter
13
53743
1296
更靠近且更聰明,
00:55
and more free.
14
55063
1214
還有給予更多自由。
00:57
But as we were doing the inspiring work
of building search engines
15
57265
3714
但當我們開始進行
這鼓舞人心的工作,
建立搜尋引擎,
01:01
and video-sharing sites
and social networks,
16
61003
2886
建立影片分享網站和社交網路,
01:04
criminals, dictators and terrorists
were figuring out
17
64907
4304
罪犯、獨裁者,
及恐怖分子都在想辦法
01:09
how to use those same
platforms against us.
18
69235
3202
如何用同樣的平台來對抗我們。
01:13
And we didn't have
the foresight to stop them.
19
73417
2455
我們沒有先見之明來阻止他們。
01:16
Over the last few years, geopolitical forces have come online to wreak havoc. And in response, Google supported a few colleagues and me to set up a new group called Jigsaw, with a mandate to make people safer from threats like violent extremism, censorship, persecution -- threats that feel very personal to me because I was born in Iran, and I left in the aftermath of a violent revolution. But I've come to realize that even if we had all of the resources of all of the technology companies in the world, we'd still fail if we overlooked one critical ingredient: the human experiences of the victims and perpetrators of those threats.
02:04
There are many challenges I could talk to you about today. I'm going to focus on just two. The first is terrorism.
02:13
So in order to understand the radicalization process, we met with dozens of former members of violent extremist groups. One was a British schoolgirl, who had been taken off of a plane at London Heathrow as she was trying to make her way to Syria to join ISIS. And she was 13 years old.
02:37
So I sat down with her and her father, and I said, "Why?" And she said, "I was looking at pictures of what life is like in Syria, and I thought I was going to go and live in the Islamic Disney World."
02:52
That's what she saw in ISIS. She thought she'd meet and marry a jihadi Brad Pitt and go shopping in the mall all day and live happily ever after.
03:02
ISIS understands what drives people, and they carefully craft a message for each audience. Just look at how many languages they translate their marketing material into. They make pamphlets, radio shows and videos in not just English and Arabic, but German, Russian, French, Turkish, Kurdish, Hebrew, Mandarin Chinese. I've even seen an ISIS-produced video in sign language. Just think about that for a second: ISIS took the time and made the effort to ensure their message is reaching the deaf and hard of hearing.
03:45
It's actually not tech-savviness that is the reason why ISIS wins hearts and minds. It's their insight into the prejudices, the vulnerabilities, the desires of the people they're trying to reach that does that. That's why it's not enough for the online platforms to focus on removing recruiting material.
04:04
If we want to have a shot at building meaningful technology that's going to counter radicalization, we have to start with the human journey at its core.
04:13
So we went to Iraq to speak to young men who'd bought into ISIS's promise of heroism and righteousness, who'd taken up arms to fight for them and then who'd defected after they witnessed the brutality of ISIS's rule. And I'm sitting there in this makeshift prison in the north of Iraq with this 23-year-old who had actually trained as a suicide bomber before defecting.
04:39
And he says, "I arrived in Syria full of hope, and immediately, I had two of my prized possessions confiscated: my passport and my mobile phone." The symbols of his physical and digital liberty were taken away from him on arrival.
04:57
And then this is the way he described that moment of loss to me. He said, "You know in 'Tom and Jerry,' when Jerry wants to escape, and then Tom locks the door and swallows the key and you see it bulging out of his throat as it travels down?"
05:14
And of course, I really could see the image that he was describing, and I really did connect with the feeling that he was trying to convey, which was one of doom, when you know there's no way out.
05:26
And I was wondering: What, if anything, could have changed his mind the day that he left home?
05:32
So I asked, "If you knew everything that you know now about the suffering and the corruption, the brutality -- that day you left home, would you still have gone?"
05:43
And he said, "Yes." And I thought, "Holy crap, he said 'Yes.'"
05:48
And then he said, "At that point, I was so brainwashed, I wasn't taking in any contradictory information. I couldn't have been swayed."
05:59
"Well, what if you knew everything that you know now six months before the day that you left?"
06:05
"At that point, I think it probably would have changed my mind."
06:10
Radicalization isn't this yes-or-no choice. It's a process, during which people have questions -- about ideology, religion, the living conditions. And they're coming online for answers, which is an opportunity to reach them.
06:25
And there are videos online from people who have answers -- defectors, for example, telling the story of their journey into and out of violence; stories like the one from that man I met in the Iraqi prison. There are locals who've uploaded cell phone footage of what life is really like in the caliphate under ISIS's rule. There are clerics who are sharing peaceful interpretations of Islam.
06:48
But you know what? These people don't generally have the marketing prowess of ISIS. They risk their lives to speak up and confront terrorist propaganda, and then they tragically don't reach the people who most need to hear from them.
07:03
And we wanted to see if technology could change that.
07:06
So in 2016, we partnered with Moonshot CVE to pilot a new approach to countering radicalization called the "Redirect Method." It uses the power of online advertising to bridge the gap between those susceptible to ISIS's messaging and those credible voices that are debunking that messaging.
07:28
And it works like this: someone looking for extremist material -- say they search for "How do I join ISIS?" -- will see an ad appear that invites them to watch a YouTube video of a cleric, of a defector -- someone who has an authentic answer. And that targeting is based not on a profile of who they are, but of determining something that's directly relevant to their query or question.
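For readers who want the mechanics spelled out, here is a minimal, hypothetical sketch of the query-level targeting idea described above: match the intent of the search query itself against curated keyword lists, and answer it with an ad that links to counter-narrative videos. The keyword lists, playlist URLs and function names are illustrative assumptions, not Jigsaw's or Moonshot CVE's actual implementation.

```python
# Minimal, hypothetical sketch of the Redirect Method's query-level targeting.
# Keyword lists, playlist URLs and names are placeholders, not the real system.

from typing import Optional

# Curated per-language keyword lists that signal interest in extremist material
# (maintained by subject-matter experts in the real deployment).
RISK_KEYWORDS = {
    "en": ["join isis", "travel to syria to fight", "caliphate recruitment"],
    "ar": [],  # populated by Arabic-speaking experts
}

# Playlists of counter-narrative videos: defectors, clerics, residents filming
# daily life under ISIS rule.
COUNTER_NARRATIVE_PLAYLISTS = {
    "en": "https://www.youtube.com/playlist?list=EXAMPLE_EN",
    "ar": "https://www.youtube.com/playlist?list=EXAMPLE_AR",
}


def match_redirect_ad(query: str, language: str) -> Optional[dict]:
    """Return an ad creative if the query matches a risk keyword list.

    The decision uses only the query itself (what the person is asking right
    now), never a profile of who they are.
    """
    normalized = query.lower().strip()
    for keyword in RISK_KEYWORDS.get(language, []):
        if keyword in normalized:
            return {
                "headline": "Hear from people who went -- and came back",
                "landing_url": COUNTER_NARRATIVE_PLAYLISTS[language],
            }
    return None  # No match: no ad is shown.


if __name__ == "__main__":
    print(match_redirect_ad("How do I join ISIS?", "en"))
```

The pilot described next ran this kind of query-triggered advertising in English and Arabic for eight weeks.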
07:54
During our eight-week pilot in English and Arabic, we reached over 300,000 people who had expressed an interest in or sympathy towards a jihadi group. These people were now watching videos that could prevent them from making devastating choices.
08:13
And because violent extremism isn't confined to any one language, religion or ideology, the Redirect Method is now being deployed globally to protect people being courted online by violent ideologues, whether they're Islamists, white supremacists or other violent extremists, with the goal of giving them the chance to hear from someone on the other side of that journey; to give them the chance to choose a different path.
08:40
It turns out that often the bad guys are good at exploiting the internet, not because they're some kind of technological geniuses, but because they understand what makes people tick.
08:54
I want to give you a second example: online harassment.
09:00
Online harassers also work to figure out what will resonate with another human being. But not to recruit them like ISIS does, but to cause them pain.
09:11
Imagine this: you're a woman, you're married, you have a kid. You post something on social media, and in a reply, you're told that you'll be raped, that your son will be watching, details of when and where. In fact, your home address is put online for everyone to see.
09:33
That feels like a pretty real threat. Do you think you'd go home? Do you think you'd continue doing the thing that you were doing? Would you continue doing that thing that's irritating your attacker?
09:48
Online abuse has been this perverse art of figuring out what makes people angry, what makes people afraid, what makes people insecure, and then pushing those pressure points until they're silenced.
10:02
When online harassment goes unchecked, free speech is stifled. And even the people hosting the conversation throw up their arms and call it quits, closing their comment sections and their forums altogether.
10:14
That means we're actually losing spaces online to meet and exchange ideas. And where online spaces remain, we descend into echo chambers with people who think just like us. But that enables the spread of disinformation; that facilitates polarization.
10:34
What if technology instead could enable empathy at scale?
10:40
This was the question that motivated our partnership with Google's Counter Abuse team, Wikipedia and newspapers like the New York Times. We wanted to see if we could build machine-learning models that could understand the emotional impact of language. Could we predict which comments were likely to make someone else leave the online conversation?
11:00
And that's no mean feat. That's no trivial accomplishment for AI to be able to do something like that. I mean, just consider these two examples of messages that could have been sent to me last week. "Break a leg at TED!" ... and "I'll break your legs at TED."
(Laughter)
11:22
You are human, that's why that's an obvious difference to you, even though the words are pretty much the same. But for AI, it takes some training to teach the models to recognize that difference.
11:32
The beauty of building AI that can tell the difference is that AI can then scale to the size of the online toxicity phenomenon, and that was our goal in building our technology called Perspective.
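Perspective is available to developers through Google's Comment Analyzer API. The sketch below is a minimal example rather than the production integration: it scores the two messages from the example above for toxicity. The endpoint, attribute name and response shape reflect the public documentation at the time of writing, so check the current Perspective API docs before relying on it, and note that the API key is a placeholder.

```python
# Minimal sketch of calling the Perspective (Comment Analyzer) API.
# Assumes a valid API key and the v1alpha1 endpoint and TOXICITY attribute as
# publicly documented at the time of writing; consult the current docs.

import requests

API_KEY = "YOUR_API_KEY"  # placeholder
URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/"
    f"comments:analyze?key={API_KEY}"
)


def toxicity_score(text: str) -> float:
    """Return Perspective's TOXICITY probability (0.0 to 1.0) for a comment."""
    payload = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(URL, json=payload, timeout=10)
    response.raise_for_status()
    body = response.json()
    return body["attributeScores"]["TOXICITY"]["summaryScore"]["value"]


if __name__ == "__main__":
    for message in ["Break a leg at TED!", "I'll break your legs at TED."]:
        print(f"{message!r}: toxicity ~ {toxicity_score(message):.2f}")
```

A publisher can then hold, rank or flag comments whose scores cross a threshold of its own choosing, which is the kind of moderation support behind the New York Times figures that follow.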
11:45
With the help of Perspective, the New York Times, for example, has increased spaces online for conversation. Before our collaboration, they only had comments enabled on just 10 percent of their articles. With the help of machine learning, they have that number up to 30 percent. So they've tripled it, and we're still just getting started.
12:04
But this is about way more than just making moderators more efficient.
12:10
Right now I can see you, and I can gauge how what I'm saying is landing with you. You don't have that opportunity online.
12:18
Imagine if machine learning could give commenters, as they're typing, real-time feedback about how their words might land, just like facial expressions do in a face-to-face conversation.
12:32
Machine learning isn't perfect, and it still makes plenty of mistakes. But if we can build technology that understands the emotional impact of language, we can build empathy.
12:43
That means that we can have dialogue between people with different politics, different worldviews, different values. And we can reinvigorate the spaces online that most of us have given up on.
12:57
When people use technology to exploit and harm others, they're preying on our human fears and vulnerabilities. If we ever thought that we could build an internet insulated from the dark side of humanity, we were wrong.
13:14
If we want today to build technology that can overcome the challenges that we face, we have to throw our entire selves into understanding the issues and into building solutions that are as human as the problems they aim to solve.
13:30
Let's make that happen. Thank you.
13:33
(Applause)