Translator: Marssi Draw
Reviewer: William Choi
00:12
In 2007, I became the attorney general of the state of New Jersey. Before that, I'd been a criminal prosecutor, first in the Manhattan district attorney's office, and then at the United States Department of Justice. But when I became the attorney general, two things happened that changed the way I see criminal justice.
00:30
The first is that I asked what I thought were really basic questions. I wanted to understand who we were arresting, who we were charging, and who we were putting in our nation's jails and prisons. I also wanted to understand if we were making decisions in a way that made us safer. And I couldn't get this information out. It turned out that most big criminal justice agencies like my own didn't track the things that matter.
00:58
So after about a month of being incredibly frustrated, I walked down into a conference room that was filled with detectives and stacks and stacks of case files, and the detectives were sitting there with yellow legal pads taking notes. They were trying to get the information I was looking for by going through case by case for the past five years. And as you can imagine, when we finally got the results, they weren't good. It turned out that we were doing a lot of low-level drug cases on the streets just around the corner from our office in Trenton.
01:30
The second thing that happened is that I spent the day in the Camden, New Jersey police department. Now, at that time, Camden, New Jersey, was the most dangerous city in America. I ran the Camden Police Department because of that. I spent the day in the police department, and I was taken into a room with senior police officials, all of whom were working hard and trying very hard to reduce crime in Camden. And what I saw in that room, as we talked about how to reduce crime, were a series of officers with a lot of little yellow sticky notes. And they would take a yellow sticky and they would write something on it and they would put it up on a board. And one of them said, "We had a robbery two weeks ago. We have no suspects." And another said, "We had a shooting in this neighborhood last week. We have no suspects." We weren't using data-driven policing. We were essentially trying to fight crime with yellow Post-it notes.
02:22
Now, both of these things made me realize fundamentally that we were failing. We didn't even know who was in our criminal justice system, we didn't have any data about the things that mattered, and we didn't share data or use analytics or tools to help us make better decisions and to reduce crime.
02:41
And for the first time, I started to think about how we made decisions. When I was an assistant D.A., and when I was a federal prosecutor, I looked at the cases in front of me, and I generally made decisions based on my instinct and my experience. When I became attorney general, I could look at the system as a whole, and what surprised me is that I found that that was exactly how we were doing it across the entire system -- in police departments, in prosecutors' offices, in courts and in jails.
03:09
And what I learned very quickly is that we weren't doing a good job. So I wanted to do things differently. I wanted to introduce data and analytics and rigorous statistical analysis into our work. In short, I wanted to moneyball criminal justice.
03:25
Now, moneyball, as many of you know, is what the Oakland A's did, where they used smart data and statistics to figure out how to pick players that would help them win games. They went from a system that was based on baseball scouts, who used to go out and watch players and use their instinct and experience, the scouts' instincts and experience, to pick players, to one that used smart data and rigorous statistical analysis to figure out how to pick players that would help them win games.
03:50
It worked for the Oakland A's, and it worked in the state of New Jersey. We took Camden off the top of the list as the most dangerous city in America. We reduced murders there by 41 percent, which actually means 37 lives were saved. And we reduced all crime in the city by 26 percent.
04:08
We also changed the way we did criminal prosecutions. So we went from doing low-level drug crimes that were outside our building to doing cases of statewide importance, on things like reducing violence with the most violent offenders, prosecuting street gangs, gun and drug trafficking, and political corruption.
04:26
And all of this matters greatly, because public safety to me is the most important function of government. If we're not safe, we can't be educated, we can't be healthy, we can't do any of the other things we want to do in our lives.
04:39
And we live in a country today where we face serious criminal justice problems. We have 12 million arrests every single year. The vast majority of those arrests are for low-level crimes, like misdemeanors, 70 to 80 percent. Less than five percent of all arrests are for violent crime. Yet we spend 75 billion, that's b for billion, dollars a year on state and local corrections costs. Right now, today, we have 2.3 million people in our jails and prisons. And we face unbelievable public safety challenges because we have a situation in which two thirds of the people in our jails are there waiting for trial. They haven't yet been convicted of a crime. They're just waiting for their day in court. And 67 percent of people come back. Our recidivism rate is amongst the highest in the world. Almost seven in 10 people who are released from prison will be rearrested in a constant cycle of crime and incarceration.
05:39
So when I started my job at the Arnold Foundation, I came back to looking at a lot of these questions, and I came back to thinking about how we had used data and analytics to transform the way we did criminal justice in New Jersey. And when I look at the criminal justice system in the United States today, I feel the exact same way that I did about the state of New Jersey when I started there, which is that we absolutely have to do better, and I know that we can do better.
06:04
So I decided to focus on using data and analytics to help make the most critical decision in public safety, and that decision is the determination of whether, when someone has been arrested, they pose a risk to public safety and should be detained, or whether they don't pose a risk to public safety and should be released.
06:24
Everything that happens in criminal cases comes out of this one decision. It impacts everything. It impacts sentencing. It impacts whether someone gets drug treatment. It impacts crime and violence.
06:34
And when I talk to judges around the United States, which I do all the time now, they all say the same thing, which is that we put dangerous people in jail, and we let non-dangerous, nonviolent people out. They mean it and they believe it. But when you start to look at the data, which, by the way, the judges don't have, when we start to look at the data, what we find time and time again is that this isn't the case. We find low-risk offenders, who make up 50 percent of our entire criminal justice population, and we find that they're in jail.
07:07
Take Leslie Chew, who was a Texas man who stole four blankets on a cold winter night. He was arrested, and he was kept in jail on 3,500 dollars bail, an amount that he could not afford to pay. And he stayed in jail for eight months until his case came up for trial, at a cost to taxpayers of more than 9,000 dollars.
07:28
And at the other end of the spectrum, we're doing an equally terrible job. The people who we find are the highest-risk offenders, the people who we think have the highest likelihood of committing a new crime if they're released, we see nationally that 50 percent of those people are being released.
07:46
The reason for this is the way we make decisions. Judges have the best intentions when they make these decisions about risk, but they're making them subjectively. They're like the baseball scouts 20 years ago who were using their instinct and their experience to try to decide what risk someone poses. They're being subjective, and we know what happens with subjective decision making, which is that we are often wrong.
08:09
What we need in this space are strong data and analytics. What I decided to look for was a strong data and analytic risk assessment tool, something that would let judges actually understand, in a scientific and objective way, what the risk was that was posed by someone in front of them.
08:26
I looked all over the country, and I found that between five and 10 percent of all U.S. jurisdictions actually use any type of risk assessment tool, and when I looked at these tools, I quickly realized why. They were unbelievably expensive to administer, they were time-consuming, and they were limited to the local jurisdiction in which they'd been created. So basically, they couldn't be scaled or transferred to other places.
08:49
So I went out and built a phenomenal team of data scientists and researchers and statisticians to build a universal risk assessment tool, so that every single judge in the United States of America can have an objective, scientific measure of risk.
09:05
In the tool that we've built, what we did was we collected 1.5 million cases from all around the United States, from cities, from counties, from every single state in the country, from the federal districts. And with those 1.5 million cases, which is the largest data set on pretrial in the United States today, we were able to basically find that there were 900-plus risk factors that we could look at to try to figure out what mattered most.
09:30
And we found that there were nine specific things that mattered all across the country and that were the most highly predictive of risk. And so we built a universal risk assessment tool.
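The winnowing described above, starting from hundreds of candidate factors and keeping the ones that best predict the outcome, can be sketched in a few lines. This is a minimal illustration only: the data set and factor names below are invented, and the real analysis ran over 1.5 million cases and 900-plus factors with far more rigorous statistics.

```python
# Rank candidate risk factors by how strongly each one predicts the
# outcome (here, rearrest), keeping the strongest. All records and
# factor names are hypothetical, invented for this sketch.

def rank_risk_factors(cases, factor_names, outcome_key="rearrested"):
    """Rank binary risk factors by |Pearson correlation| with the outcome."""
    n = len(cases)
    mean_y = sum(c[outcome_key] for c in cases) / n
    var_y = sum((c[outcome_key] - mean_y) ** 2 for c in cases) / n
    scores = {}
    for f in factor_names:
        mean_x = sum(c[f] for c in cases) / n
        cov = sum((c[f] - mean_x) * (c[outcome_key] - mean_y) for c in cases) / n
        var_x = sum((c[f] - mean_x) ** 2 for c in cases) / n
        if var_x == 0 or var_y == 0:
            scores[f] = 0.0  # constant column carries no signal
        else:
            scores[f] = abs(cov / (var_x ** 0.5 * var_y ** 0.5))
    return sorted(factor_names, key=lambda f: scores[f], reverse=True)

# Tiny invented data set: 1 = factor present / defendant was rearrested.
cases = [
    {"prior_violence": 1, "prior_fta": 1, "owns_pet": 0, "rearrested": 1},
    {"prior_violence": 1, "prior_fta": 0, "owns_pet": 1, "rearrested": 1},
    {"prior_violence": 0, "prior_fta": 1, "owns_pet": 0, "rearrested": 1},
    {"prior_violence": 0, "prior_fta": 0, "owns_pet": 1, "rearrested": 0},
    {"prior_violence": 0, "prior_fta": 0, "owns_pet": 0, "rearrested": 0},
    {"prior_violence": 1, "prior_fta": 0, "owns_pet": 1, "rearrested": 1},
]
ranked = rank_risk_factors(cases, ["prior_violence", "prior_fta", "owns_pet"])
print(ranked)  # history factors rank high; the irrelevant factor ranks last
```

On a real data set one would also check that the kept factors generalize across jurisdictions, which is what made the talk's nine factors usable nationwide.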
09:41
And it looks like this. As you'll see, we put some information in, but most of it is incredibly simple, it's easy to use, it focuses on things like the defendant's prior convictions, whether they've been sentenced to incarceration, whether they've engaged in violence before, whether they've even failed to come back to court.
09:58
And with this tool, we can predict three things. First, whether or not someone will commit a new crime if they're released. Second, for the first time, and I think this is incredibly important, we can predict whether someone will commit an act of violence if they're released. And that's the single most important thing that judges say when you talk to them. And third, we can predict whether someone will come back to court. And every single judge in the United States of America can use it, because it's been created on a universal data set.
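The three predictions can be pictured as a simple point-based scoring function over the history factors the speaker names. The sketch below is purely illustrative: the weights, the 1-6 scale, and the violence-flag cutoff are all invented for this example and are not the actual tool's model.

```python
# Hypothetical point weights per factor for each of the three predicted
# outcomes. These numbers are invented for illustration only.
WEIGHTS = {
    "new_criminal_activity": {"prior_convictions": 2, "prior_incarceration": 1,
                              "prior_violence": 1, "prior_fta": 1},
    "new_violent_activity":  {"prior_convictions": 1, "prior_incarceration": 1,
                              "prior_violence": 3, "prior_fta": 0},
    "failure_to_appear":     {"prior_convictions": 1, "prior_incarceration": 0,
                              "prior_violence": 0, "prior_fta": 3},
}

def assess(defendant):
    """Return a 1-6 score per outcome, plus an elevated-violence flag."""
    report = {}
    for outcome, weights in WEIGHTS.items():
        raw = sum(w for factor, w in weights.items() if defendant.get(factor))
        # Map the raw point total onto a 1-6 scale (6 = highest risk).
        report[outcome] = min(6, 1 + raw)
    # Flag defendants whose violence score crosses a (hypothetical) cutoff.
    report["elevated_violence_flag"] = report["new_violent_activity"] >= 4
    return report

defendant = {"prior_convictions": True, "prior_violence": True,
             "prior_incarceration": False, "prior_fta": False}
print(assess(defendant))
```

A fixed additive scheme like this is what makes such a tool cheap to administer: the inputs are a handful of facts already in the case file, and no interview is needed.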
10:25
What judges see if they run the risk assessment tool is this -- it's a dashboard. At the top, you see the New Criminal Activity Score, six of course being the highest, and then in the middle you see, "Elevated risk of violence." What that says is that this person is someone who has an elevated risk of violence that the judge should look twice at. And then, towards the bottom, you see the Failure to Appear Score, which again is the likelihood that someone will come back to court.
10:51
Now I want to say something really important. It's not that I think we should be eliminating the judge's instinct and experience from this process. I don't. I actually believe the problem that we see, and the reason that we have these incredible system errors, where we're incarcerating low-level, nonviolent people and we're releasing high-risk, dangerous people, is that we don't have an objective measure of risk. But what I believe should happen is that we should take that data-driven risk assessment and combine that with the judge's instinct and experience to lead us to better decision making.
11:24
The tool went statewide in Kentucky on July 1, and we're about to go up in a number of other U.S. jurisdictions. Our goal, quite simply, is that every single judge in the United States will use a data-driven risk tool within the next five years. We're now working on risk tools for prosecutors and for police officers as well, to try to take a system that runs today in America the same way it did 50 years ago, based on instinct and experience, and make it into one that runs on data and analytics.
11:55
Now, the great news about all this, and we have a ton of work left to do, and we have a lot of culture to change, but the great news about all of it is that we know it works. It's why Google is Google, and it's why all these baseball teams use moneyball to win games.
12:10
The great news for us as well is that it's the way that we can transform the American criminal justice system. It's how we can make our streets safer, we can reduce our prison costs, and we can make our system much fairer and more just. Some people call it data science. I call it moneyballing criminal justice.
12:29
Thank you.

12:31
(Applause)