Translator: Lilian Chiu
Reviewer: NAN-KUN WU
00:13
I work on helping computers
communicate about the world around us.
00:17
There are a lot of ways to do this,
00:19
and I like to focus on helping computers
00:22
to talk about what they see
and understand.
00:25
Given a scene like this,
00:27
a modern computer-vision algorithm
00:29
can tell you that there's a woman
and there's a dog.
00:32
It can tell you that the woman is smiling.
00:34
It might even be able to tell you
that the dog is incredibly cute.
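As a minimal sketch of that detection step, one could run a pretrained, open-source object detector such as torchvision's COCO-trained Faster R-CNN, whose label set includes "person" and "dog". The image file here is hypothetical, and recognizing the smile would need a separate attribute classifier on top.

```python
import torch
from PIL import Image
from torchvision.models.detection import (
    FasterRCNN_ResNet50_FPN_Weights,
    fasterrcnn_resnet50_fpn,
)
from torchvision.transforms.functional import to_tensor

# Pretrained COCO detector; its category list includes "person" and "dog".
weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()

image = Image.open("beach_scene.jpg").convert("RGB")  # hypothetical input image

with torch.no_grad():
    prediction = model([to_tensor(image)])[0]

categories = weights.meta["categories"]
for label, score in zip(prediction["labels"], prediction["scores"]):
    if score > 0.8:  # keep only confident detections
        print(categories[int(label)], f"{float(score):.2f}")
```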
00:38
I work on this problem
00:40
thinking about how humans
understand and process the world.
00:45
The thoughts, memories and stories
00:48
that a scene like this
might evoke for humans.
00:51
All the interconnections
of related situations.
00:55
Maybe you've seen
a dog like this one before,
00:58
or you've spent time
running on a beach like this one,
01:01
and that further evokes thoughts
and memories of a past vacation,
01:06
past times at the beach,
01:08
times spent running around
with other dogs.
01:11
One of my guiding principles
is that by helping computers to understand
01:16
what it's like to have these experiences,
01:19
to understand what we share
and believe and feel,
01:26
then we're in a great position
to start evolving computer technology
01:30
in a way that's complementary
with our own experiences.
01:35
So, digging more deeply into this,
01:38
a few years ago I began working on helping
computers to generate human-like stories
01:44
from sequences of images.
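As a rough, hypothetical illustration of that pipeline (not the actual research system, which is trained end-to-end on storytelling data), one could caption each image with an open-source model and stitch the captions into a narrative; the file names are placeholders.

```python
from transformers import pipeline

# Open-source image captioner standing in for a learned storytelling model.
captioner = pipeline("image-to-text",
                     model="Salesforce/blip-image-captioning-base")

photos = ["trip_01.jpg", "trip_02.jpg", "trip_03.jpg"]  # hypothetical sequence
captions = [captioner(path)[0]["generated_text"] for path in photos]

# Naive stitching; a real system would model the narrative arc jointly.
story = " Then ".join(captions)
print(story)
```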
01:47
So, one day,
01:49
I was working with my computer to ask it
what it thought about a trip to Australia.
01:54
It took a look at the pictures,
and it saw a koala.
01:58
It didn't know what the koala was,
01:59
but it said it thought
it was an interesting-looking creature.
02:04
Then I shared with it a sequence of images
about a house burning down.
02:09
It took a look at the images and it said,
02:13
"This is an amazing view!
This is spectacular!"
02:17
It sent chills down my spine.
02:20
It saw a horrible, life-changing
and life-destroying event
02:25
and thought it was something positive.
02:27
I realized that it recognized
the contrast,
02:31
the reds, the yellows,
02:34
and thought it was something
worth remarking on positively.
02:37
And part of why it was doing this
02:39
was because most
of the images I had given it
02:42
were positive images.
02:44
That's because people
tend to share positive images
02:48
when they talk about their experiences.
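The mechanics of that skew are easy to see in miniature: with hypothetical label counts, a model that learns nothing but the label frequencies of its training set will call nearly everything positive, a burning house included.

```python
from collections import Counter

# Hypothetical training labels: people mostly share happy moments.
training_labels = ["positive"] * 95 + ["negative"] * 5

prior = Counter(training_labels)
p_positive = prior["positive"] / len(training_labels)

# A model that only learns this prior rates any new scene as
# "positive" with high confidence -- even a house fire.
print(f"P(positive) from label frequency alone: {p_positive:.2f}")  # 0.95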
02:51
When was the last time
you saw a selfie at a funeral?
02:55
I realized that,
as I worked on improving AI
02:58
task by task, dataset by dataset,
03:02
that I was creating massive gaps,
03:05
holes and blind spots
in what it could understand.
03:10
And while doing so,
03:11
I was encoding all kinds of biases.
03:15
Biases that reflect a limited viewpoint,
03:18
limited to a single dataset --
03:21
biases that can reflect
human biases found in the data,
03:25
such as prejudice and stereotyping.
03:29
I thought back to the evolution
of the technology
03:32
that brought me to where I was that day --
03:35
how the first color images
03:38
were calibrated against
a white woman's skin,
03:41
meaning that color photography
was biased against black faces.
03:46
And that same bias, that same blind spot
03:49
continued well into the '90s.
03:51
And the same blind spot
continues even today
03:54
in how well we can recognize
different people's faces
03:58
in facial recognition technology.
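One way to surface such a blind spot, in the spirit of audits like Gender Shades, is to report accuracy per demographic group rather than a single aggregate; the audit records below are made up for illustration.

```python
from collections import defaultdict

# Hypothetical audit records: (demographic group, recognized correctly?)
results = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals = defaultdict(int)
correct = defaultdict(int)
for group, ok in results:
    totals[group] += 1
    correct[group] += ok  # True counts as 1

overall = sum(correct.values()) / sum(totals.values())
print(f"overall accuracy: {overall:.2f}")  # one number hides the gap
for group in sorted(totals):
    print(f"{group}: {correct[group] / totals[group]:.2f}")  # the gap shows here
```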
04:01
I thought about the state of the art
in research today,
04:04
where we tend to limit our thinking
to one dataset and one problem.
04:09
And that in doing so, we were creating
more blind spots and biases
04:14
that the AI could further amplify.
04:17
I realized then
that we had to think deeply
04:19
about how the technology we work on today
looks in five years, in 10 years.
04:25
Humans evolve slowly,
with time to correct for issues
04:29
in the interaction of humans
and their environment.
04:33
In contrast, artificial intelligence
is evolving at an incredibly fast rate.
04:39
And that means that it really matters
04:40
that we think about this
carefully right now --
04:44
that we reflect on our own blind spots,
04:47
our own biases,
04:49
and think about how that's informing
the technology we're creating
04:53
and discuss what the technology of today
will mean for tomorrow.
04:58
CEOs and scientists have weighed in
on what they think
05:01
the artificial intelligence technology
of the future will be.
05:05
Stephen Hawking warns that
05:06
"Artificial intelligence
could end mankind."
05:10
Elon Musk warns
that it's an existential risk
05:13
and one of the greatest risks
that we face as a civilization.
05:17
Bill Gates has made the point,
05:19
"I don't understand
why people aren't more concerned."
05:23
But these views --
05:25
they're part of the story.
05:28
The math, the models,
05:30
the basic building blocks
of artificial intelligence
05:33
are something that we can all access
and work with.
05:36
We have open-source tools
for machine learning and intelligence
05:40
that we can contribute to.
05:42
And beyond that,
we can share our experience.
05:46
We can share our experiences
with technology and how it concerns us
05:50
and how it excites us.
05:52
We can discuss what we love.
05:55
We can communicate with foresight
05:57
about the aspects of technology
that could be more beneficial
06:02
or could be more problematic over time.
06:05
If we all focus on opening up
the discussion on AI
06:09
with foresight towards the future,
06:13
this will help create a general
conversation and awareness
06:17
about what AI is now,
06:21
what it can become
06:23
and all the things that we need to do
06:25
in order to enable that outcome
that best suits us.
06:29
We already see and know this
in the technology that we use today.
06:33
We use smart phones
and digital assistants and Roombas.
06:38
Are they evil?
06:40
Maybe sometimes.
06:42
Are they beneficial?
06:45
Yes, they're that, too.
06:48
And they're not all the same.
06:50
And there you already see
a light shining on what the future holds.
06:54
The future continues on
from what we build and create right now.
06:59
We set into motion that domino effect
07:01
that carves out AI's evolutionary path.
07:05
In our time right now,
we shape the AI of tomorrow.
07:08
Technology that immerses us
in augmented realities
07:12
bringing to life past worlds.
07:15
Technology that helps people
to share their experiences
07:20
when they have difficulty communicating.
07:23
Technology built on understanding
the streaming visual worlds
07:27
used as technology for self-driving cars.
07:32
Technology built on understanding images
and generating language,
07:35
evolving into technology that helps people
who are visually impaired
07:40
be better able to access the visual world.
07:42
And we also see how technology
can lead to problems.
07:46
We have technology today
07:48
that analyzes physical
characteristics we're born with --
07:52
such as the color of our skin
or the look of our face --
07:55
in order to determine whether or not
we might be criminals or terrorists.
07:59
We have technology
that crunches through our data,
08:02
even data relating
to our gender or our race,
08:05
in order to determine whether or not
we might get a loan.
08:09
All that we see now
08:11
is a snapshot in the evolution
of artificial intelligence.
08:15
Because where we are right now,
08:17
is within a moment of that evolution.
08:20
That means that what we do now
will affect what happens down the line
08:24
and in the future.
08:26
If we want AI to evolve
in a way that helps humans,
08:30
then we need to define
the goals and strategies
08:32
that enable that path now.
08:35
What I'd like to see is something
that fits well with humans,
08:39
with our culture and with the environment.
08:43
Technology that aids and assists
those of us with neurological conditions
08:47
or other disabilities
08:49
in order to make life
equally challenging for everyone.
08:54
Technology that works
08:55
regardless of your demographics
or the color of your skin.
09:00
And so today, what I focus on
is the technology for tomorrow
09:05
and for 10 years from now.
09:08
AI can turn out in many different ways.
09:11
But in this case,
09:12
it isn't a self-driving car
without any destination.
09:16
This is the car that we are driving.
09:19
We choose when to speed up
and when to slow down.
09:23
We choose if we need to make a turn.
09:26
We choose what the AI
of the future will be.
09:31
There's a vast playing field
09:32
of all the things that artificial
intelligence can become.
09:36
It will become many things.
09:39
And it's up to us now,
09:41
in order to figure out
what we need to put in place
09:44
to make sure the outcomes
of artificial intelligence
09:48
are the ones that will be
better for all of us.
09:51
Thank you.
09:52
(Applause)