The era of blind faith in big data must end | Cathy O'Neil

251,236 views ・ 2017-09-07

TED



Translator: Lilian Chiu  Reviewer: NAN-KUN WU
00:12
Algorithms are everywhere. They sort and separate the winners from the losers. The winners get the job or a good credit card offer. The losers don't even get an interview, or they pay more for insurance. We're being scored with secret formulas that we don't understand that often don't have systems of appeal. That begs the question: What if the algorithms are wrong?

00:44
To build an algorithm you need two things: you need data, what happened in the past, and a definition of success, the thing you're looking for and often hoping for. You train an algorithm by looking, figuring out. The algorithm figures out what is associated with success. What situation leads to success?

01:04
Actually, everyone uses algorithms. They just don't formalize them in written code. Let me give you an example. I use an algorithm every day to make a meal for my family. The data I use is the ingredients in my kitchen, the time I have, the ambition I have, and I curate that data. I don't count those little packages of ramen noodles as food. (Laughter)

01:28
My definition of success is: a meal is successful if my kids eat vegetables. It's very different from if my youngest son were in charge. He'd say success is if he gets to eat lots of Nutella. But I get to choose success. I am in charge. My opinion matters. That's the first rule of algorithms. Algorithms are opinions embedded in code.

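The point becomes concrete if you write the meal algorithm down. Below is a minimal Python sketch with invented ingredients and candidate meals: the data and the definition of success are both explicit choices, and swapping in the son's definition changes what gets served.

```python
# A minimal sketch of the meal algorithm described above. Everything here is
# invented for illustration; the point is that the definition of success is
# an explicit, opinionated parameter.

def is_success(meal, definition):
    """Decide whether a meal counts as a success under a chosen opinion."""
    if definition == "kids_eat_vegetables":
        return any(item in {"broccoli", "spinach", "carrots"} for item in meal)
    if definition == "lots_of_nutella":
        return meal.count("nutella") >= 2
    raise ValueError(f"unknown definition of success: {definition}")

def plan_meal(ingredients, definition):
    """Return the first candidate meal that can be cooked and counts as a success."""
    candidates = [
        ["pasta", "broccoli", "chicken"],
        ["toast", "nutella", "nutella"],
        ["rice", "spinach", "tofu"],
    ]
    for meal in candidates:
        if set(meal) <= set(ingredients) and is_success(meal, definition):
            return meal
    return None  # note: ramen was curated out of the data entirely

pantry = ["pasta", "broccoli", "chicken", "toast", "nutella", "rice", "spinach", "tofu"]
print(plan_meal(pantry, "kids_eat_vegetables"))  # ['pasta', 'broccoli', 'chicken']
print(plan_meal(pantry, "lots_of_nutella"))      # ['toast', 'nutella', 'nutella']
```
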
01:53
It's really different from what you think most people think of algorithms. They think algorithms are objective and true and scientific. That's a marketing trick. It's also a marketing trick to intimidate you with algorithms, to make you trust and fear algorithms because you trust and fear mathematics. A lot can go wrong when we put blind faith in big data.

02:23
This is Kiri Soares. She's a high school principal in Brooklyn. In 2011, she told me her teachers were being scored with a complex, secret algorithm called the "value-added model." I told her, "Well, figure out what the formula is, show it to me. I'm going to explain it to you." She said, "Well, I tried to get the formula, but my Department of Education contact told me it was math and I wouldn't understand it."

02:47
It gets worse. The New York Post filed a Freedom of Information Act request, got all the teachers' names and all their scores, and they published them as an act of teacher-shaming. When I tried to get the formulas, the source code, through the same means, I was told I couldn't. I was denied. I later found out that nobody in New York City had access to that formula. No one understood it.

03:13
Then someone really smart got involved, Gary Rubinstein. He found 665 teachers from that New York Post data that actually had two scores. That could happen if they were teaching seventh grade math and eighth grade math. He decided to plot them. Each dot represents a teacher. (Laughter) What is that? (Laughter) That should never have been used for individual assessment. It's almost a random number generator. (Applause) But it was.

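The check Rubinstein ran is easy to describe in code. Here is a sketch using simulated scores in place of the New York Post data: for each teacher with two scores in the same year, compare one score with the other. A trustworthy measure would give a correlation near 1; something near 0 is what "almost a random number generator" looks like.

```python
# A sketch of the two-score consistency check, with simulated data standing in
# for the New York Post records (665 teachers, each scored twice).
import random

random.seed(0)
score_7th = [random.uniform(0, 100) for _ in range(665)]
score_8th = [random.uniform(0, 100) for _ in range(665)]  # independent: the worst case

def pearson(xs, ys):
    """Pearson correlation, written out to keep the sketch dependency-free."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Plotting score_7th against score_8th gives the cloud of dots from the talk;
# the correlation summarizes it in one number.
print(f"correlation between a teacher's two scores: {pearson(score_7th, score_8th):.2f}")
```
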
03:45
This is Sarah Wysocki. She got fired, along with 205 other teachers, from the Washington, DC school district, even though she had great recommendations from her principal and the parents of her kids.

03:57
I know what a lot of you guys are thinking, especially the data scientists, the AI experts here. You're thinking, "Well, I would never make an algorithm that inconsistent." But algorithms can go wrong, even have deeply destructive effects with good intentions. And whereas an airplane that's designed badly crashes to the earth and everyone sees it, an algorithm designed badly can go on for a long time, silently wreaking havoc.

04:27
This is Roger Ailes. (Laughter) He founded Fox News in 1996. More than 20 women complained about sexual harassment. They said they weren't allowed to succeed at Fox News. He was ousted last year, but we've seen recently that the problems have persisted. That begs the question: What should Fox News do to turn over another leaf?

04:53
Well, what if they replaced their hiring process with a machine-learning algorithm? That sounds good, right? Think about it. The data, what would the data be? A reasonable choice would be the last 21 years of applications to Fox News. Reasonable. What about the definition of success? Reasonable choice would be, well, who is successful at Fox News? I guess someone who, say, stayed there for four years and was promoted at least once. Sounds reasonable. And then the algorithm would be trained. It would be trained to look for people, to learn what led to success, what kind of applications historically led to success by that definition.

05:36
Now think about what would happen if we applied that to a current pool of applicants. It would filter out women, because they do not look like people who were successful in the past.

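To see why, it helps to sketch the thought experiment with numbers. The figures below are invented and the "model" is deliberately the simplest possible one, but the mechanism is the same for fancier learners: if the historical definition of success was rarely attainable for women, a model fit to that history ranks women lower.

```python
# A toy version of the hiring thought experiment: "success" means stayed four
# years and was promoted at least once, and the training data is invented but
# skewed the way the talk describes.
from collections import defaultdict

history = (
    [{"gender": "M", "successful": True}] * 300 +
    [{"gender": "M", "successful": False}] * 700 +
    [{"gender": "F", "successful": True}] * 30 +
    [{"gender": "F", "successful": False}] * 970
)

# "Training": estimate the historical success rate for each group.
counts = defaultdict(lambda: [0, 0])  # gender -> [successes, applications]
for row in history:
    counts[row["gender"]][0] += int(row["successful"])
    counts[row["gender"]][1] += 1

def score(applicant):
    """Score a new applicant by how people who looked like them fared in the past."""
    successes, total = counts[applicant["gender"]]
    return successes / total

print(score({"gender": "M"}))  # 0.30
print(score({"gender": "F"}))  # 0.03 -- the past, automated
```
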
05:51
Algorithms don't make things fair if you just blithely, blindly apply algorithms. They don't make things fair. They repeat our past practices, our patterns. They automate the status quo. That would be great if we had a perfect world, but we don't. And I'll add that most companies don't have embarrassing lawsuits, but the data scientists in those companies are told to follow the data, to focus on accuracy. Think about what that means. Because we all have bias, it means they could be codifying sexism or any other kind of bigotry.

06:31
Thought experiment, because I like them: an entirely segregated society -- racially segregated, all towns, all neighborhoods -- and where we send the police only to the minority neighborhoods to look for crime. The arrest data would be very biased. What if, on top of that, we found the data scientists and paid the data scientists to predict where the next crime would occur? Minority neighborhood. Or to predict who the next criminal would be? A minority. The data scientists would brag about how great and how accurate their model would be, and they'd be right.

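The self-confirming part of that loop can be shown with a toy simulation. In the sketch below, both neighborhoods have identical underlying crime rates, but patrols follow past arrests and arrests can only happen where patrols go, so the skew in the initial data never corrects itself. All numbers are invented.

```python
# A toy feedback loop: identical true crime rates, but the "hotspot model"
# allocates patrols in proportion to past arrests, and new arrests can only
# come from patrolled neighborhoods.
true_rate = {"A": 0.05, "B": 0.05}   # identical underlying rates
arrests = {"A": 40, "B": 10}         # the history is already skewed toward A
PATROLS = 100

for year in range(5):
    total = sum(arrests.values())
    patrols = {n: round(PATROLS * arrests[n] / total) for n in arrests}
    for n in arrests:
        # arrests track where we looked, not where crime actually is
        arrests[n] += round(patrols[n] * true_rate[n] * 100)
    print(f"year {year}: patrols {patrols}")
# patrols stay 80/20 toward neighborhood A forever, and the data keeps "confirming" it
```
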
07:15
Now, reality isn't that drastic, but we do have severe segregations in many cities and towns, and we have plenty of evidence of biased policing and justice system data. And we actually do predict hotspots, places where crimes will occur. And we do predict, in fact, the individual criminality, the criminality of individuals.

07:38
The news organization ProPublica recently looked into one of those "recidivism risk" algorithms, as they're called, being used in Florida during sentencing by judges. Bernard, on the left, the black man, was scored a 10 out of 10. Dylan, on the right, 3 out of 10. 10 out of 10, high risk. 3 out of 10, low risk. They were both brought in for drug possession. They both had records, but Dylan had a felony but Bernard didn't. This matters, because the higher score you are, the more likely you're being given a longer sentence.

08:18
What's going on? Data laundering. It's a process by which technologists hide ugly truths inside black box algorithms and call them objective; call them meritocratic. When they're secret, important and destructive, I've coined a term for these algorithms: "weapons of math destruction." (Laughter) (Applause)

08:46
They're everywhere, and it's not a mistake. These are private companies building private algorithms for private ends. Even the ones I talked about for teachers and the public police, those were built by private companies and sold to the government institutions. They call it their "secret sauce" -- that's why they can't tell us about it. It's also private power. They are profiting for wielding the authority of the inscrutable.

09:16
Now you might think, since all this stuff is private and there's competition, maybe the free market will solve this problem. It won't. There's a lot of money to be made in unfairness.

09:28
Also, we're not economic rational agents. We all are biased. We're all racist and bigoted in ways that we wish we weren't, in ways that we don't even know. We know this, though, in aggregate, because sociologists have consistently demonstrated this with these experiments they build, where they send a bunch of applications to jobs out, equally qualified but some have white-sounding names and some have black-sounding names, and it's always disappointing, the results -- always.

09:59
So we are the ones that are biased, and we are injecting those biases into the algorithms by choosing what data to collect, like I chose not to think about ramen noodles -- I decided it was irrelevant. But by trusting the data that's actually picking up on past practices and by choosing the definition of success, how can we expect the algorithms to emerge unscathed? We can't. We have to check them.

10:22
We can't. We have to check them. We have to check them for fairness. The good news is, we can check them for fairness. Algorithms can be interrogated, and they will tell us the truth every time. And we can fix them. We can make them better. I call this an algorithmic audit, and I'll walk you through it.

10:42
First, data integrity check. For the recidivism risk algorithm I talked about, a data integrity check would mean we'd have to come to terms with the fact that in the US, whites and blacks smoke pot at the same rate but blacks are far more likely to be arrested -- four or five times more likely, depending on the area. What is that bias looking like in other crime categories, and how do we account for it?

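As a sketch of what that first step can look like in practice, here is the pot example with placeholder rates chosen in the spirit of the figures quoted above: divide the arrest rate by the rate of the underlying behavior for each group, and compare.

```python
# A sketch of a data integrity check: how strongly does the training data
# (arrests) overstate the underlying behavior for each group? The rates are
# placeholders, not real statistics.
usage_rate = {"white": 0.12, "black": 0.12}     # assumed equal rates of use
arrest_rate = {"white": 0.002, "black": 0.008}  # assumed arrests per person per year

arrests_per_use = {g: arrest_rate[g] / usage_rate[g] for g in usage_rate}
for group, rate in arrests_per_use.items():
    print(f"{group}: {rate:.3f} arrests per unit of actual behavior")

disparity = arrests_per_use["black"] / arrests_per_use["white"]
print(f"the arrest data over-represents one group by a factor of {disparity:.1f}")
```
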
11:07
Second, we should think about the definition of success, audit that. Remember -- with the hiring algorithm? We talked about it. Someone who stays for four years and is promoted once? Well, that is a successful employee, but it's also an employee that is supported by their culture. That said, also it can be quite biased. We need to separate those two things. We should look to the blind orchestra audition as an example. That's where the people auditioning are behind a sheet. What I want to think about there is the people who are listening have decided what's important and they've decided what's not important, and they're not getting distracted by that. When the blind orchestra auditions started, the number of women in orchestras went up by a factor of five.

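The software analogue of the sheet is deciding up front which attributes the scorer is allowed to see and stripping everything else before scoring. A minimal sketch with hypothetical field names follows; note that this alone does not remove proxies that correlate with the hidden attributes, and that choosing what counts as important is itself the opinion being audited.

```python
# A sketch of the blind-audition idea applied to a scoring pipeline: the
# evaluators decide which attributes matter, and nothing else ever reaches
# the scorer. Field names are hypothetical.
RELEVANT = {"years_experience", "audition_score", "certifications"}

def behind_the_sheet(application: dict) -> dict:
    """Keep only the attributes the evaluators decided actually matter."""
    return {k: v for k, v in application.items() if k in RELEVANT}

applicant = {
    "name": "J. Doe",
    "gender": "F",
    "years_experience": 7,
    "audition_score": 92,
    "certifications": 3,
}
print(behind_the_sheet(applicant))
# {'years_experience': 7, 'audition_score': 92, 'certifications': 3}
```
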
11:52
Next, we have to consider accuracy. This is where the value-added model for teachers would fail immediately. No algorithm is perfect, of course, so we have to consider the errors of every algorithm. How often are there errors, and for whom does this model fail? What is the cost of that failure?

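One way to make those questions operational is to break the errors out by group rather than reporting a single accuracy number. A sketch with invented records: the asymmetry it surfaces, with one group getting mostly false positives and the other mostly false negatives, is the kind of pattern ProPublica reported for the recidivism scores.

```python
# A sketch of an accuracy audit: for whom does the model fail, and in which
# direction? The records are invented for illustration.
from collections import defaultdict

records = [
    # (group, predicted_high_risk, actually_reoffended)
    ("A", True, False), ("A", True, True), ("A", True, False), ("A", False, False),
    ("B", False, True), ("B", False, False), ("B", True, True), ("B", False, False),
]

tally = defaultdict(lambda: defaultdict(int))
for group, predicted, actual in records:
    key = ("t" if predicted == actual else "f") + ("p" if predicted else "n")
    tally[group][key] += 1

for group, t in tally.items():
    fpr = t["fp"] / (t["fp"] + t["tn"])  # labeled high risk among those who did not reoffend
    fnr = t["fn"] / (t["fn"] + t["tp"])  # labeled low risk among those who did
    print(f"group {group}: false positive rate {fpr:.2f}, false negative rate {fnr:.2f}")
```
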
12:14
And finally, we have to consider the long-term effects of algorithms, the feedback loops that are engendering. That sounds abstract, but imagine if Facebook engineers had considered that before they decided to show us only things that our friends had posted.

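Feedback loops are easier to reason about with even a crude model. In the toy sketch below (purely illustrative, not how any real feed works), the feed re-weights toward whatever was engaged with before, engagement follows exposure, and the variety of what gets shown collapses within a few iterations.

```python
# A toy feedback loop for a content feed: exposure drives engagement, and the
# next round of exposure is re-weighted toward what was engaged with. The
# categories and weights are invented.
shown = {"friends": 0.5, "news": 0.3, "new_voices": 0.2}

for week in range(6):
    engagement = dict(shown)  # people mostly engage with what they are shown
    total = sum(v ** 2 for v in engagement.values())
    shown = {k: v ** 2 / total for k, v in engagement.items()}  # rich get richer
    print(f"week {week}: " + ", ".join(f"{k} {v:.2f}" for k, v in shown.items()))
# within a few weeks the feed is almost entirely "friends"
```
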
12:33
I have two more messages, one for the data scientists out there. Data scientists: we should not be the arbiters of truth. We should be translators of ethical discussions that happen in larger society. (Applause)

12:49
And the rest of you, the non-data scientists: this is not a math test. This is a political fight. We need to demand accountability for our algorithmic overlords. (Applause)

13:05
The era of blind faith in big data must end. Thank you very much. (Applause)