How technology can fight extremism and online harassment | Yasmin Green

75,368 views ・ 2018-06-27

TED


Translator: Phyllis Lin · Reviewer: 睿哲 王
00:13
My relationship with the internet reminds me of the setup to a clichéd horror movie. You know, the blissfully happy family moves in to their perfect new home, excited about their perfect future, and it's sunny outside and the birds are chirping ... And then it gets dark. And there are noises from the attic. And we realize that that perfect new house isn't so perfect.

00:40
When I started working at Google in 2006, Facebook was just a two-year-old, and Twitter hadn't yet been born. And I was in absolute awe of the internet and all of its promise to make us closer and smarter and more free. But as we were doing the inspiring work of building search engines and video-sharing sites and social networks, criminals, dictators and terrorists were figuring out how to use those same platforms against us. And we didn't have the foresight to stop them.
01:16
Over the last few years, geopolitical forces have come online to wreak havoc. And in response, Google supported a few colleagues and me to set up a new group called Jigsaw, with a mandate to make people safer from threats like violent extremism, censorship, persecution -- threats that feel very personal to me because I was born in Iran, and I left in the aftermath of a violent revolution. But I've come to realize that even if we had all of the resources of all of the technology companies in the world, we'd still fail if we overlooked one critical ingredient: the human experiences of the victims and perpetrators of those threats.

02:04
There are many challenges I could talk to you about today. I'm going to focus on just two. The first is terrorism.

02:13
So in order to understand the radicalization process, we met with dozens of former members of violent extremist groups. One was a British schoolgirl, who had been taken off of a plane at London Heathrow as she was trying to make her way to Syria to join ISIS. And she was 13 years old.
02:37
So I sat down with her and her father, and I said, "Why?" And she said, "I was looking at pictures of what life is like in Syria, and I thought I was going to go and live in the Islamic Disney World." That's what she saw in ISIS. She thought she'd meet and marry a jihadi Brad Pitt and go shopping in the mall all day and live happily ever after.

03:02
ISIS understands what drives people, and they carefully craft a message for each audience. Just look at how many languages they translate their marketing material into. They make pamphlets, radio shows and videos in not just English and Arabic, but German, Russian, French, Turkish, Kurdish, Hebrew, Mandarin Chinese.

03:29
I've even seen an ISIS-produced video in sign language. Just think about that for a second: ISIS took the time and made the effort to ensure their message is reaching the deaf and hard of hearing.

03:45
It's actually not tech-savviness that is the reason why ISIS wins hearts and minds. It's their insight into the prejudices, the vulnerabilities, the desires of the people they're trying to reach that does that. That's why it's not enough for the online platforms to focus on removing recruiting material. If we want to have a shot at building meaningful technology that's going to counter radicalization, we have to start with the human journey at its core.
04:13
So we went to Iraq to speak to young men who'd bought into ISIS's promise of heroism and righteousness, who'd taken up arms to fight for them and then who'd defected after they witnessed the brutality of ISIS's rule. And I'm sitting there in this makeshift prison in the north of Iraq with this 23-year-old who had actually trained as a suicide bomber before defecting.

04:39
And he says, "I arrived in Syria full of hope, and immediately, I had two of my prized possessions confiscated: my passport and my mobile phone." The symbols of his physical and digital liberty were taken away from him on arrival.

04:57
And then this is the way he described that moment of loss to me. He said, "You know in 'Tom and Jerry,' when Jerry wants to escape, and then Tom locks the door and swallows the key and you see it bulging out of his throat as it travels down?" And of course, I really could see the image that he was describing, and I really did connect with the feeling that he was trying to convey, which was one of doom, when you know there's no way out.

05:26
And I was wondering: What, if anything, could have changed his mind the day that he left home? So I asked, "If you knew everything that you know now about the suffering and the corruption, the brutality -- that day you left home, would you still have gone?" And he said, "Yes." And I thought, "Holy crap, he said 'Yes.'" And then he said, "At that point, I was so brainwashed, I wasn't taking in any contradictory information. I couldn't have been swayed."
05:59
"Well, what if you knew everything that you know now six months before the day that you left?" "At that point, I think it probably would have changed my mind."

06:10
Radicalization isn't this yes-or-no choice. It's a process, during which people have questions -- about ideology, religion, the living conditions. And they're coming online for answers, which is an opportunity to reach them. And there are videos online from people who have answers -- defectors, for example, telling the story of their journey into and out of violence; stories like the one from that man I met in the Iraqi prison. There are locals who've uploaded cell phone footage of what life is really like in the caliphate under ISIS's rule. There are clerics who are sharing peaceful interpretations of Islam.

06:48
But you know what? These people don't generally have the marketing prowess of ISIS. They risk their lives to speak up and confront terrorist propaganda, and then they tragically don't reach the people who most need to hear from them. And we wanted to see if technology could change that.
07:06
So in 2016, we partnered with Moonshot CVE to pilot a new approach to countering radicalization called the "Redirect Method." It uses the power of online advertising to bridge the gap between those susceptible to ISIS's messaging and those credible voices that are debunking that messaging.

07:28
And it works like this: someone looking for extremist material -- say they search for "How do I join ISIS?" -- will see an ad appear that invites them to watch a YouTube video of a cleric, of a defector -- someone who has an authentic answer. And that targeting is based not on a profile of who they are, but of determining something that's directly relevant to their query or question.
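The core mechanic described above -- match the query, not the person, and answer with existing counter-narrative content -- can be sketched in a few lines. This is an illustrative toy only: the indicator phrases, ad titles and URLs below are hypothetical placeholders, whereas the real pilot used curated keyword lists and real YouTube videos from clerics and defectors.

```python
import re

# Hypothetical indicator phrases and counter-narrative "ads" (placeholders,
# not the real campaign data).
INDICATOR_PHRASES = [
    "how do i join isis",
    "travel to the caliphate",
]

COUNTER_NARRATIVE_ADS = [
    {"title": "A defector's story", "url": "https://youtube.com/watch?v=EXAMPLE1"},
    {"title": "A cleric on peaceful Islam", "url": "https://youtube.com/watch?v=EXAMPLE2"},
]

def normalize(query: str) -> str:
    """Lowercase and strip punctuation so matching is robust."""
    return " ".join(re.findall(r"[a-z']+", query.lower()))

def redirect_ad(query: str):
    """Return a counter-narrative ad if the query matches an indicator
    phrase; otherwise None (no ad shown). Targeting keys off the query
    itself, never off a profile of the person searching."""
    q = normalize(query)
    for phrase in INDICATOR_PHRASES:
        if phrase in q:
            # Deterministic rotation; a real system would rank by
            # language and by which video best fits the query.
            return COUNTER_NARRATIVE_ADS[len(q) % len(COUNTER_NARRATIVE_ADS)]
    return None
```

An innocuous query returns no ad at all, which is the point: the intervention is scoped to the moment someone is actively looking for extremist material.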
07:54
During our eight-week pilot in English and Arabic, we reached over 300,000 people who had expressed an interest in or sympathy towards a jihadi group. These people were now watching videos that could prevent them from making devastating choices.

08:13
And because violent extremism isn't confined to any one language, religion or ideology, the Redirect Method is now being deployed globally to protect people being courted online by violent ideologues, whether they're Islamists, white supremacists or other violent extremists, with the goal of giving them the chance to hear from someone on the other side of that journey; to give them the chance to choose a different path.

08:40
It turns out that often the bad guys are good at exploiting the internet, not because they're some kind of technological geniuses, but because they understand what makes people tick.

08:54
I want to give you a second example: online harassment.
09:00
Online harassers also work to figure out what will resonate with another human being. But not to recruit them like ISIS does, but to cause them pain.

09:11
Imagine this: you're a woman, you're married, you have a kid. You post something on social media, and in a reply, you're told that you'll be raped, that your son will be watching, details of when and where. In fact, your home address is put online for everyone to see. That feels like a pretty real threat.

09:37
Do you think you'd go home? Do you think you'd continue doing the thing that you were doing? Would you continue doing that thing that's irritating your attacker?

09:48
Online abuse has been this perverse art of figuring out what makes people angry, what makes people afraid, what makes people insecure, and then pushing those pressure points until they're silenced.

10:02
When online harassment goes unchecked, free speech is stifled. And even the people hosting the conversation throw up their arms and call it quits, closing their comment sections and their forums altogether. That means we're actually losing spaces online to meet and exchange ideas. And where online spaces remain, we descend into echo chambers with people who think just like us. But that enables the spread of disinformation; that facilitates polarization.

10:34
What if technology instead could enable empathy at scale?
10:40
This was the question that motivated our partnership with Google's Counter Abuse team, Wikipedia and newspapers like the New York Times. We wanted to see if we could build machine-learning models that could understand the emotional impact of language. Could we predict which comments were likely to make someone else leave the online conversation?

11:00
And that's no mean feat. That's no trivial accomplishment for AI to be able to do something like that. I mean, just consider these two examples of messages that could have been sent to me last week.

11:15
"Break a leg at TED!" ... and "I'll break your legs at TED."

11:20
(Laughter)

11:22
You are human, that's why that's an obvious difference to you, even though the words are pretty much the same. But for AI, it takes some training to teach the models to recognize that difference.
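To make "it takes some training" concrete, here is a deliberately tiny sketch: a classic perceptron over unigram and bigram features, trained on a handful of hypothetical labeled messages. Bigrams are what let it separate the idiom "break a leg" from the threat "break your legs". This toy is not the production model, which was trained on far larger human-labeled data.

```python
import re

def features(text: str) -> set:
    """Unigram + bigram features over lowercased word tokens."""
    toks = re.findall(r"[a-z']+", text.lower())
    return set(toks) | {" ".join(p) for p in zip(toks, toks[1:])}

# A tiny, hypothetical labeled set: +1 = abusive, -1 = benign.
TRAIN = [
    ("i will break your legs", +1),
    ("i will kill you", +1),
    ("nobody wants you here", +1),
    ("break a leg tonight", -1),
    ("i will be cheering for you", -1),
    ("great to have you here", -1),
]

def train_perceptron(data, epochs: int = 5) -> dict:
    """Classic perceptron: nudge feature weights toward the true label
    on every misclassified example."""
    w = {}
    for _ in range(epochs):
        for text, label in data:
            score = sum(w.get(f, 0) for f in features(text))
            pred = 1 if score > 0 else -1
            if pred != label:
                for f in features(text):
                    w[f] = w.get(f, 0) + label
    return w

def is_abusive(w: dict, text: str) -> bool:
    """Positive total weight means the model flags the message."""
    return sum(w.get(f, 0) for f in features(text)) > 0

w = train_perceptron(TRAIN)
```

After a few epochs the weights on "break your" and "your legs" turn positive while "break a" and "a leg" turn negative, so the two nearly identical TED messages land on opposite sides of the boundary.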
11:32
The beauty of building AI that can tell the difference is that AI can then scale to the size of the online toxicity phenomenon, and that was our goal in building our technology called Perspective.
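Perspective is exposed to developers as a public web API. The sketch below follows the publicly documented Comment Analyzer request/response shape (endpoint, `requestedAttributes`, `summaryScore`); `YOUR_API_KEY` is a placeholder you would obtain from Google Cloud, and no network call is made here.

```python
import json

# Documented Comment Analyzer endpoint; the key is a placeholder.
API_URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
           "comments:analyze?key=YOUR_API_KEY")

def build_request(text: str) -> dict:
    """Request body asking for a TOXICITY score for one comment."""
    return {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }

def toxicity_from_response(response: dict) -> float:
    """Extract the 0..1 summary score from an API response dict."""
    return response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

# To actually send it, POST the JSON body, e.g. with urllib.request:
#   req = urllib.request.Request(
#       API_URL,
#       data=json.dumps(build_request("your comment")).encode(),
#       headers={"Content-Type": "application/json"})
```

A moderation pipeline would compare the returned score against a threshold tuned for its community before hiding, queueing, or surfacing a comment.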
11:45
With the help of Perspective, the New York Times, for example, has increased spaces online for conversation. Before our collaboration, they only had comments enabled on just 10 percent of their articles. With the help of machine learning, they have that number up to 30 percent. So they've tripled it, and we're still just getting started.

12:04
But this is about way more than just making moderators more efficient. Right now I can see you, and I can gauge how what I'm saying is landing with you. You don't have that opportunity online. Imagine if machine learning could give commenters, as they're typing, real-time feedback about how their words might land, just like facial expressions do in a face-to-face conversation.
12:32
Machine learning isn't perfect, and it still makes plenty of mistakes. But if we can build technology that understands the emotional impact of language, we can build empathy. That means that we can have dialogue between people with different politics, different worldviews, different values. And we can reinvigorate the spaces online that most of us have given up on.

12:57
When people use technology to exploit and harm others, they're preying on our human fears and vulnerabilities. If we ever thought that we could build an internet insulated from the dark side of humanity, we were wrong. If we want today to build technology that can overcome the challenges that we face, we have to throw our entire selves into understanding the issues and into building solutions that are as human as the problems they aim to solve.

13:30
Let's make that happen. Thank you.

13:33
(Applause)