Translator: Ivana Korom
Reviewer: Joanna Pietrulewicz
Chinese translator: Lilian Chiu
Chinese reviewer: Bruce Sung

00:12
How many decisions have been made about you today, or this week or this year, by artificial intelligence?

00:22
I build AI for a living, so, full disclosure, I'm kind of a nerd. And because I'm kind of a nerd, whenever some news story comes out about artificial intelligence stealing all our jobs, or robots getting citizenship of an actual country, I'm the person my friends and followers message, freaking out about the future.

00:45
We see this everywhere. This media panic that our robot overlords are taking over. We could blame Hollywood for that.

00:56
But in reality, that's not the problem we should be focusing on. There is a more pressing danger, a bigger risk with AI, that we need to fix first.

01:07
So we are back to this question: How many decisions have been made about you today by AI? And how many of these were based on your gender, your race or your background?

01:24
Algorithms are being used all the time to make decisions about who we are and what we want. Some of the women in this room will know what I'm talking about if you've been made to sit through those pregnancy test adverts on YouTube like 1,000 times. Or you've scrolled past adverts of fertility clinics on your Facebook feed. Or, in my case, Indian marriage bureaus.
(Laughter)
01:51
But AI isn't just being used
to make decisions
31
111333
2976
但,人工智慧不只是被用來判定
01:54
about what products we want to buy
32
114333
2601
我們想要買什麼產品,
01:56
or which show we want to binge watch next.
33
116958
2500
或是我們接下來想要看追哪齣劇。
02:01
I wonder how you'd feel about someone
who thought things like this:
34
121042
5184
我很好奇,對於這樣想的人,
你們有何感覺:
02:06
"A black or Latino person
35
126250
1934
「黑人或拉丁裔的人
02:08
is less likely than a white person
to pay off their loan on time."
36
128208
4125
準時還清貸款的可能性
沒有白人高。」
02:13
"A person called John
makes a better programmer
37
133542
2809
「名字叫做約翰的人,
和名叫瑪莉的人相比,
02:16
than a person called Mary."
38
136375
1667
會是比較好的程式設計師。」
02:19
"A black man is more likely to be
a repeat offender than a white man."
39
139250
5083
「比起白人,黑人比較
有可能會再次犯罪。」

02:26
You're probably thinking, "Wow, that sounds like a pretty sexist, racist person," right? These are some real decisions that AI has made very recently, based on the biases it has learned from us, from the humans.

02:43
AI is being used to help decide whether or not you get that job interview; how much you pay for your car insurance; how good your credit score is; and even what rating you get in your annual performance review. But these decisions are all being filtered through its assumptions about our identity, our race, our gender, our age.

03:08
How is that happening? Now, imagine an AI is helping a hiring manager find the next tech leader in the company. So far, the manager has been hiring mostly men. So the AI learns men are more likely to be programmers than women. And it's a very short leap from there to: men make better programmers than women. We have reinforced our own bias into the AI. And now, it's screening out female candidates.
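
(To make that leap concrete: here is a minimal sketch, not from the talk, of how a model trained on biased hiring history learns to penalize gender. It assumes Python with NumPy and scikit-learn, and every value in it is fabricated for illustration.)

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical encoding for illustration only: 0 = male, 1 = female.
rng = np.random.default_rng(0)
n = 1000
gender = rng.integers(0, 2, n)
skill = rng.normal(0.0, 1.0, n)  # true ability, identically distributed for everyone

# Fabricated history: the manager hired skilled men and almost no women.
hired = ((skill > 0) & (gender == 0)).astype(int)

X = np.column_stack([gender, skill])
model = LogisticRegression().fit(X, hired)

# Two candidates with identical skill, differing only in gender:
candidates = np.array([[0, 1.0], [1, 1.0]])
print(model.predict_proba(candidates)[:, 1])
# The male candidate scores far higher: the model has learned the
# manager's bias, not anything about programming ability.

Nothing here is specific to hiring; any model given labels that encode a human's bias will reproduce it.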

03:40
Hang on, if a human hiring manager did that, we'd be outraged, we wouldn't allow it. This kind of gender discrimination is not OK. And yet somehow, AI has become above the law, because a machine made the decision.

03:57
That's not it. We are also reinforcing our bias in how we interact with AI. How often do you use a voice assistant like Siri, Alexa or even Cortana? They all have two things in common: one, they can never get my name right, and second, they are all female.

04:20
They are designed to be our obedient servants, turning your lights on and off, ordering your shopping. You get male AIs too, but they tend to be more high-powered, like IBM Watson, making business decisions, or Salesforce Einstein, or ROSS, the robot lawyer. So poor robots, even they suffer from sexism in the workplace.
(Laughter)

04:44
Think about how these two things combine and affect a kid growing up in today's world around AI. So they're doing some research for a school project and they Google images of "CEO." The algorithm shows them results of mostly men. And now, they Google "personal assistant." As you can guess, it shows them mostly females. And then they want to put on some music, and maybe order some food, and now, they are barking orders at an obedient female voice assistant.

05:19
Some of our brightest minds are creating this technology today. Technology that they could have created in any way they wanted. And yet, they have chosen to create it in the style of a 1950s "Mad Men" secretary. Yay!

05:36
But OK, don't worry, this is not going to end with me telling you that we are all heading towards sexist, racist machines running the world. The good news about AI is that it is entirely within our control. We get to teach the right values, the right ethics, to AI.

05:56
So there are three things we can do. One, we can be aware of our own biases and the bias in machines around us. Two, we can make sure that diverse teams are building this technology. And three, we have to give it diverse experiences to learn from.

06:14
I can talk about the first two from personal experience. When you work in technology and you don't look like a Mark Zuckerberg or Elon Musk, your life is a little bit difficult, your ability gets questioned.

06:27
Here's just one example. Like most developers, I often join online tech forums and share my knowledge to help others. And I've found, when I log on as myself, with my own photo, my own name, I tend to get questions or comments like this: "What makes you think you're qualified to talk about AI?" "What makes you think you know about machine learning?"

06:53
So, as you do, I made a new profile, and this time, instead of my own picture, I chose a cat with a jet pack on it. And I chose a name that did not reveal my gender. You can probably guess where this is going, right? So, this time, I didn't get any of those patronizing comments about my ability and I was able to actually get some work done.

07:19
And it sucks, guys. I've been building robots since I was 15, I have a few degrees in computer science, and yet, I had to hide my gender in order for my work to be taken seriously.

07:31
So, what's going on here? Are men just better at technology than women? Another study found that when women coders on one platform hid their gender, like myself, their code was accepted four percent more often than men's. So this is not about the talent.

07:51
This is about an elitism in AI that says a programmer needs to look like a certain person. What we really need to do to make AI better is bring people from all kinds of backgrounds.

08:06
We need people who can write and tell stories to help us create personalities of AI. We need people who can solve problems. We need people who face different challenges, and we need people who can tell us what are the real issues that need fixing and help us find ways that technology can actually fix it.

08:29
Because when people from diverse backgrounds come together, when we build things in the right way, the possibilities are limitless.

08:38
And that's what I want to end by talking to you about. Less racist robots, less machines that are going to take our jobs -- and more about what technology can actually achieve.

08:50
So, yes, some of the energy in the world of AI, in the world of technology, is going to be about what ads you see on your stream. But a lot of it is going towards making the world so much better.

09:05
Think about a pregnant woman in the Democratic Republic of Congo, who has to walk 17 hours to her nearest rural prenatal clinic to get a checkup. What if she could get a diagnosis on her phone instead?

09:19
Or think about what AI could do for those one in three women in South Africa who face domestic violence. If it wasn't safe to talk out loud, they could get an AI service to raise an alarm, get financial and legal advice.

09:35
These are all real examples of projects that people, including myself, are working on right now, using AI.

09:45
So, I'm sure in the next couple of days there will be yet another news story about the existential risk, robots taking over and coming for your jobs.
(Laughter)
09:55
And when something like that happens, I know I'll get the same messages worrying about the future. But I feel incredibly positive about this technology.

10:07
This is our chance to remake the world into a much more equal place. But to do that, we need to build it the right way from the get-go. We need people of different genders, races, sexualities and backgrounds. We need women to be the makers, and not just the machines who do the makers' bidding.

10:33
We need to think very carefully about what we teach machines, what data we give them, so they don't just repeat our own past mistakes.

10:44
So I hope I leave you thinking about two things. First, I hope you leave thinking about bias today. And that the next time you scroll past an advert that assumes you are interested in fertility clinics or online betting websites, you think and remember that the same technology is assuming that a black man will reoffend. Or that a woman is more likely to be a personal assistant than a CEO. And I hope that reminds you that we need to do something about it.

11:20
And second, I hope you think about the fact that you don't need to look a certain way or have a certain background in engineering or technology to create AI, which is going to be a phenomenal force for our future. You don't need to look like a Mark Zuckerberg, you can look like me.

11:41
And it is up to all of us in this room to convince the governments and the corporations to build AI technology for everyone, including the edge cases. And for us all to get education about this phenomenal technology in the future.

11:58
Because if we do that, then we've only just scratched the surface of what we can achieve with AI.

12:05
Thank you.
(Applause)