Translator: Thomas Tam
Reviewer: Echo Sun
00:13
I work on helping computers
communicate about the world around us.
00:17
There are a lot of ways to do this,
00:19
and I like to focus on helping computers
00:22
to talk about what they see
and understand.
00:25
Given a scene like this,
00:27
a modern computer-vision algorithm
00:29
can tell you that there's a woman
and there's a dog.
00:32
It can tell you that the woman is smiling.
00:34
It might even be able to tell you
that the dog is incredibly cute.
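(As an illustration of the kind of off-the-shelf vision pipeline described above, here is a minimal sketch assuming the open-source Hugging Face transformers library; the model names and the image file are placeholders, not the speaker's actual system.)

```python
# Sketch: what a modern vision model can report about a scene like this one.
# Assumes `pip install transformers pillow torch`; "beach_scene.jpg" is a placeholder file.
from transformers import pipeline

detector = pipeline("object-detection", model="facebook/detr-resnet-50")
captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

image_path = "beach_scene.jpg"  # e.g. a woman playing with a dog on a beach

# List the objects the detector finds ("person", "dog", ...) with confidence scores.
for obj in detector(image_path):
    print(f"{obj['label']}: {obj['score']:.2f}")

# Produce a one-sentence description of the whole scene.
print(captioner(image_path)[0]["generated_text"])
```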
00:38
I work on this problem
00:40
thinking about how humans
understand and process the world.
00:45
The thoughts, memories and stories
00:48
that a scene like this
might evoke for humans.
00:51
All the interconnections
of related situations.
00:55
Maybe you've seen
a dog like this one before,
00:58
or you've spent time
running on a beach like this one,
01:01
and that further evokes thoughts
and memories of a past vacation,
01:06
past trips to the beach,
01:08
times spent running around
with other dogs.
01:11
One of my guiding principles
is that by helping computers to understand
01:16
what it's like to have these experiences,
01:19
to understand what we share
and believe and feel,
01:26
then we're in a great position
to start evolving computer technology
01:30
in a way that's complementary
with our own experiences.
01:35
So, digging more deeply into this,
01:38
a few years ago I began working on helping
computers to generate human-like stories
01:44
from sequences of images.
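(A minimal sketch of the sequence-to-story idea: caption each image with an off-the-shelf model and stitch the captions together. The model name and file names are placeholders, and this is far simpler than the actual research system.)

```python
# Sketch: caption each photo in a sequence, then join the captions into a simple narrative.
# Assumes the Hugging Face `transformers` library; the image files are placeholders.
from transformers import pipeline

captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

trip_photos = ["photo_01.jpg", "photo_02.jpg", "photo_03.jpg"]  # placeholder files
captions = [captioner(photo)[0]["generated_text"] for photo in trip_photos]

# A real storytelling model goes well beyond this, but joining captions shows the shape of the task.
story = ". ".join(caption.strip().capitalize() for caption in captions) + "."
print(story)
```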
01:47
So, one day,
01:49
I was working with my computer to ask it
what it thought about a trip to Australia.
01:54
It took a look at the pictures,
and it saw a koala.
01:58
It didn't know what the koala was,
01:59
but it said it thought
it was an interesting-looking creature.
02:04
Then I shared with it a sequence of images
about a house burning down.
02:09
It took a look at the images and it said,
02:13
"This is an amazing view!
This is spectacular!"
02:17
It sent chills down my spine.
02:20
It saw a horrible, life-changing
and life-destroying event
02:25
and thought it was something positive.
02:27
I realized that it recognized
the contrast,
02:31
the reds, the yellows,
02:34
and thought it was something
worth remarking on positively.
02:37
And part of why it was doing this
02:39
was because most
of the images I had given it
02:42
were positive images.
02:44
That's because people
tend to share positive images
02:48
when they talk about their experiences.
02:51
When was the last time
you saw a selfie at a funeral?
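(The skew described here can be made concrete with a toy count of how the training examples are labeled; the captions and labels below are invented purely for illustration.)

```python
# Sketch: if the examples a model learns from are overwhelmingly upbeat,
# bright reds and yellows tend to read as "spectacular" no matter what is burning.
from collections import Counter

training_captions = [
    ("what an amazing view", "positive"),
    ("best vacation ever", "positive"),
    ("so much fun at the beach", "positive"),
    ("this is spectacular", "positive"),
    ("our house burned down", "negative"),
]

counts = Counter(label for _, label in training_captions)
total = sum(counts.values())
for label, n in counts.most_common():
    print(f"{label}: {n}/{total} ({n / total:.0%})")
# With 80% positive examples, "positive" is the cheapest guess for any vivid image.
```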
02:55
I realized that,
as I worked on improving AI
02:58
task by task, dataset by dataset,
03:02
that I was creating massive gaps,
03:05
holes and blind spots
in what it could understand.
03:10
And while doing so,
03:11
I was encoding all kinds of biases.
03:15
Biases that reflect a limited viewpoint,
03:18
limited to a single dataset --
03:21
biases that can reflect
human biases found in the data,
03:25
such as prejudice and stereotyping.
03:29
I thought back to the evolution
of the technology
03:32
that brought me to where I was that day --
03:35
how the first color images
03:38
were calibrated against
a white woman's skin,
03:41
meaning that color photography
was biased against black faces.
03:46
And that same bias, that same blind spot
03:49
continued well into the '90s.
03:51
And the same blind spot
continues even today
03:54
in how well we can recognize
different people's faces
03:58
in facial recognition technology.
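(One way to surface the blind spot described here is to report accuracy per group rather than a single overall number; the records below are invented for illustration, not real benchmark results.)

```python
# Sketch: break face-recognition results down by group so disparities are visible.
from collections import defaultdict

# (group, was_the_face_recognized_correctly) -- invented records for illustration
results = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

per_group = defaultdict(list)
for group, correct in results:
    per_group[group].append(correct)

for group, outcomes in sorted(per_group.items()):
    accuracy = sum(outcomes) / len(outcomes)
    print(f"{group}: accuracy {accuracy:.0%} over {len(outcomes)} faces")
# A wide gap between groups is exactly what a single headline accuracy number hides.
```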
04:01
I thought about the state of the art
in research today,
04:04
where we tend to limit our thinking
to one dataset and one problem.
04:09
And that in doing so, we were creating
more blind spots and biases
04:14
that the AI could further amplify.
04:17
I realized then
that we had to think deeply
04:19
about how the technology we work on today
looks in five years, in 10 years.
04:25
Humans evolve slowly,
with time to correct for issues
04:29
in the interaction of humans
and their environment.
04:33
In contrast, artificial intelligence
is evolving at an incredibly fast rate.
04:39
And that means that it really matters
04:40
that we think about this
carefully right now --
04:44
that we reflect on our own blind spots,
04:47
our own biases,
04:49
and think about how that's informing
the technology we're creating
04:53
and discuss what the technology of today
will mean for tomorrow.
04:58
CEOs and scientists have weighed in
on what they think
05:01
the artificial intelligence technology
of the future will be.
05:05
Stephen Hawking warns that
05:06
"Artificial intelligence
could end mankind."
05:10
Elon Musk warns
that it's an existential risk
05:13
and one of the greatest risks
that we face as a civilization.
05:17
Bill Gates has made the point,
05:19
"I don't understand
why people aren't more concerned."
05:23
But these views --
05:25
they're part of the story.
05:28
The math, the models,
05:30
the basic building blocks
of artificial intelligence
05:33
are something that we can all access
and all work with.
05:36
We have open-source tools
for machine learning and intelligence
05:40
that we can contribute to.
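(As one concrete example of how accessible these building blocks are, a few lines against the open-source scikit-learn library train and evaluate a working classifier; scikit-learn is named here only as an illustration, one of many such tools.)

```python
# Sketch: the basic building blocks of machine learning are a pip install away.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)  # a small image dataset bundled with scikit-learn
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2%}")
```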
05:42
And beyond that,
we can share our experience.
05:46
We can share our experiences
with technology and how it concerns us
05:50
and how it excites us.
05:52
We can discuss what we love.
05:55
We can communicate with foresight
05:57
about the aspects of technology
that could be more beneficial
06:02
or could be more problematic over time.
06:05
If we all focus on opening up
the discussion on AI
06:09
with foresight towards the future,
06:13
this will help create a general
conversation and awareness
06:17
about what AI is now,
06:21
what it can become
06:23
and all the things that we need to do
06:25
in order to enable that outcome
that best suits us.
06:29
We already see and know this
in the technology that we use today.
06:33
We use smart phones
and digital assistants and Roombas.
06:38
Are they evil?
06:40
Maybe sometimes.
06:42
Are they beneficial?
06:45
Yes, they're that, too.
06:48
And they're not all the same.
06:50
And there you already see
a light shining on what the future holds.
06:54
The future continues on
from what we build and create right now.
06:59
We set into motion that domino effect
07:01
that carves out AI's evolutionary path.
07:05
In our time right now,
we shape the AI of tomorrow.
07:08
Technology that immerses us
in augmented realities
07:12
bringing to life past worlds.
07:15
Technology that helps people
to share their experiences
07:20
when they have difficulty communicating.
07:23
Technology built on understanding
the streaming visual worlds
07:27
used as technology for self-driving cars.
07:32
Technology built on understanding images
and generating language,
07:35
evolving into technology that helps people
who are visually impaired
07:40
be better able to access the visual world.
07:42
And we also see how technology
can lead to problems.
07:46
We have technology today
07:48
that analyzes physical
characteristics we're born with --
07:52
such as the color of our skin
or the look of our face --
07:55
in order to determine whether or not
we might be criminals or terrorists.
07:59
We have technology
that crunches through our data,
08:02
even data relating
to our gender or our race,
08:05
in order to determine whether or not
we might get a loan.
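(An audit of the kind of system described here can start with something as simple as comparing outcomes across groups; the counts below are invented purely for illustration.)

```python
# Sketch: compare loan-approval rates across groups (a demographic-parity style check).
approvals = {
    "group_a": {"approved": 80, "denied": 20},
    "group_b": {"approved": 50, "denied": 50},
}

rates = {}
for group, counts in approvals.items():
    total = counts["approved"] + counts["denied"]
    rates[group] = counts["approved"] / total
    print(f"{group}: {rates[group]:.0%} approved")

# Ratio of the lowest to the highest approval rate; values far below 1.0 flag a disparity.
print(f"parity ratio: {min(rates.values()) / max(rates.values()):.2f}")
```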
08:09
All that we see now
08:11
is a snapshot in the evolution
of artificial intelligence.
08:15
Because where we are right now,
08:17
is within a moment of that evolution.
08:20
That means that what we do now
will affect what happens down the line
08:24
and in the future.
08:26
If we want AI to evolve
in a way that helps humans,
08:30
then we need to define
the goals and strategies
08:32
that enable that path now.
08:35
What I'd like to see is something
that fits well with humans,
08:39
with our culture and with the environment.
08:43
Technology that aids and assists
those of us with neurological conditions
08:47
or other disabilities
08:49
in order to make life
equally challenging for everyone.
08:54
Technology that works
08:55
regardless of your demographics
or the color of your skin.
09:00
And so today, what I focus on
is the technology for tomorrow
09:05
and for 10 years from now.
09:08
AI can turn out in many different ways.
09:11
But in this case,
09:12
it isn't a self-driving car
without any destination.
09:16
This is the car that we are driving.
09:19
We choose when to speed up
and when to slow down.
09:23
We choose if we need to make a turn.
09:26
We choose what the AI
of the future will be.
09:31
There's a vast playing field
09:32
of all the things that artificial
intelligence can become.
09:36
It will become many things.
09:39
And it's up to us now,
09:41
in order to figure out
what we need to put in place
09:44
to make sure the outcomes
of artificial intelligence
09:48
are the ones that will be
better for all of us.
09:51
Thank you.
09:52
(Applause)