Translator: NAN-KUN WU
Reviewer: Xiaowei Dong
00:12
Whether you like it or not,
00:13
radical transparency and algorithmic decision-making is coming at you fast,
00:19
and it's going to change your life.
00:21
That's because it's now easy to take algorithms
00:23
and embed them into computers
00:25
and gather all that data that you're leaving on yourself
00:28
all over the place,
00:30
and know what you're like,
00:31
and then direct the computers to interact with you
00:34
in ways that are better than most people can.
00:37
Well, that might sound scary.
00:39
I've been doing this for a long time and I have found it to be wonderful.
00:43
My objective has been to have meaningful work
00:46
and meaningful relationships with the people I work with,
00:49
and I've learned that I couldn't have that
00:51
unless I had that radical transparency and that algorithmic decision-making.
00:56
I want to show you why that is,
00:58
I want to show you how it works.
01:00
And I warn you that some of the things that I'm going to show you
01:03
probably are a little bit shocking.
01:05
Since I was a kid, I've had a terrible rote memory.
01:09
And I didn't like following instructions,
01:12
I was no good at following instructions.
01:14
But I loved to figure out how things worked for myself.
01:18
When I was 12,
01:19
I hated school but I fell in love with trading the markets.
01:23
I caddied at the time,
01:25
earned about five dollars a bag.
01:27
And I took my caddying money, and I put it in the stock market.
01:31
And that was just because the stock market was hot at the time.
01:34
And the first company I bought
01:35
was a company by the name of Northeast Airlines.
01:39
Northeast Airlines was the only company I heard of
01:41
that was selling for less than five dollars a share.
01:44
(Laughter)
01:46
And I figured I could buy more shares,
01:48
and if it went up, I'd make more money.
01:50
So, it was a dumb strategy, right?
01:54
But I tripled my money,
01:55
and I tripled my money because I got lucky.
01:58
The company was about to go bankrupt,
02:00
but some other company acquired it,
02:02
and I tripled my money.
02:03
And I was hooked.
02:05
And I thought, "This game is easy."
02:09
With time,
02:10
I learned this game is anything but easy.
02:12
In order to be an effective investor,
02:14
one has to bet against the consensus
02:17
and be right.
02:19
And it's not easy to bet against the consensus and be right.
02:21
One has to bet against the consensus and be right
02:24
because the consensus is built into the price.
02:27
And in order to be an entrepreneur,
02:30
a successful entrepreneur,
02:32
one has to bet against the consensus and be right.
02:37
I had to be an entrepreneur and an investor --
02:40
and what goes along with that is making a lot of painful mistakes.
02:45
So I made a lot of painful mistakes,
02:48
and with time,
02:49
my attitude about those mistakes began to change.
02:52
I began to think of them as puzzles.
02:55
That if I could solve the puzzles,
02:57
they would give me gems.
02:58
And the puzzles were:
03:00
What would I do differently in the future so I wouldn't make that painful mistake?
03:05
And the gems were principles
03:07
that I would then write down so I would remember them
03:10
that would help me in the future.
03:12
And because I wrote them down so clearly,
03:15
I could then --
03:16
eventually discovered --
03:18
I could then embed them into algorithms.
03:23
And those algorithms would be embedded in computers,
03:26
and the computers would make decisions along with me;
03:30
and so in parallel, we would make these decisions.
03:33
And I could see how those decisions then compared with my own decisions,
03:37
and I could see that those decisions were a lot better.
03:40
And that was because the computer could make decisions much faster,
03:45
it could process a lot more information
03:47
and it can process decisions much more --
03:51
less emotionally.
03:54
So it radically improved my decision-making.
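The loop he describes — write a principle down, encode it as an algorithm, and let the computer decide in parallel with your own judgment so the two decisions can be compared — can be sketched minimally. The rule, the numbers, and the case names below are invented for illustration; the talk does not disclose Bridgewater's actual criteria.

```python
# Hypothetical sketch: one written-down principle encoded as a rule, run
# "in parallel" with a human call so the two decisions can be compared.
def principle_dont_chase_consensus(expected_return, consensus_return):
    """Bet only when your view differs from what the price already reflects."""
    return "bet" if expected_return > consensus_return else "pass"

# Invented cases pairing the algorithm's call with a human's call.
cases = [
    {"name": "A", "expected": 0.08, "consensus": 0.03, "human": "bet"},
    {"name": "B", "expected": 0.02, "consensus": 0.05, "human": "bet"},
]

for case in cases:
    algo = principle_dont_chase_consensus(case["expected"], case["consensus"])
    verdict = "agree" if algo == case["human"] else "differ"
    print(case["name"], algo, "vs", case["human"], "->", verdict)
```

Running the two tracks side by side is what lets the disagreements surface; each "differ" row is a prompt to ask which decision process was actually better.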
04:00
Eight years after I started Bridgewater,
04:05
I had my greatest failure,
04:06
my greatest mistake.
04:09
It was late 1970s,
04:11
I was 34 years old,
04:13
and I had calculated that American banks
04:17
had lent much more money to emerging countries
04:20
than those countries were going to be able to pay back
04:23
and that we would have the greatest debt crisis
04:25
since the Great Depression.
04:28
And with it, an economic crisis
04:30
and a big bear market in stocks.
04:33
It was a controversial view at the time.
04:35
People thought it was kind of a crazy point of view.
04:39
But in August 1982,
04:41
Mexico defaulted on its debt,
04:44
and a number of other countries followed.
04:46
And we had the greatest debt crisis since the Great Depression.
04:50
And because I had anticipated that,
04:53
I was asked to testify to Congress and appear on "Wall Street Week,"
04:58
which was the show of the time.
05:00
Just to give you a flavor of that, I've got a clip here,
05:03
and you'll see me in there.
05:06
(Video) Mr. Chairman, Mr. Mitchell,
05:08
it's a great pleasure and a great honor to be able to appear before you
05:11
in examination with what is going wrong with our economy.
05:15
The economy is now flat --
05:17
teetering on the brink of failure.
05:19
Martin Zweig: You were recently quoted in an article.
05:22
You said, "I can say this with absolute certainty
05:24
because I know how markets work."
05:26
Ray Dalio: "I can say with absolute certainty
05:28
that if you look at the liquidity base
05:30
in the corporations and the world as a whole,
05:33
that there's such reduced level of liquidity
05:35
that you can't return to an era of stagflation."
05:38
I look at that now, I think, "What an arrogant jerk!"
05:41
(Laughter)
05:45
I was so arrogant, and I was so wrong.
05:48
I mean, while the debt crisis happened,
05:50
the stock market and the economy went up rather than going down,
05:54
and I lost so much money for myself and for my clients
05:59
that I had to shut down my operation pretty much,
06:03
I had to let almost everybody go.
06:05
And these were like extended family,
06:07
I was heartbroken.
06:08
And I had lost so much money
06:10
that I had to borrow 4,000 dollars from my dad
06:14
to help to pay my family bills.
06:16
It was one of the most painful experiences of my life ...
06:21
but it turned out to be one of the greatest experiences of my life
06:24
because it changed my attitude about decision-making.
06:28
Rather than thinking, "I'm right,"
06:31
I started to ask myself,
06:32
"How do I know I'm right?"
06:36
I gained a humility that I needed
06:38
in order to balance my audacity.
06:41
I wanted to find the smartest people who would disagree with me
06:45
to try to understand their perspective
06:47
or to have them stress test my perspective.
06:51
I wanted to make an idea meritocracy.
06:54
In other words,
06:55
not an autocracy in which I would lead and others would follow
06:59
and not a democracy in which everybody's points of view were equally valued,
07:02
but I wanted to have an idea meritocracy in which the best ideas would win out.
07:07
And in order to do that,
07:09
I realized that we would need radical truthfulness
07:12
and radical transparency.
07:14
What I mean by radical truthfulness and radical transparency
07:18
is people needed to say what they really believed
07:20
and to see everything.
07:23
And we literally tape almost all conversations
07:27
and let everybody see everything,
07:28
because if we didn't do that,
07:30
we couldn't really have an idea meritocracy.
07:34
In order to have an idea meritocracy,
07:38
we have to let people speak and say what they want.
07:40
Just to give you an example,
07:42
this is an email from Jim Haskel --
07:44
somebody who works for me --
07:46
and this was available to everybody in the company.
07:49
"Ray, you deserve a 'D-'
07:52
for your performance today in the meeting ...
07:54
you did not prepare at all well
07:56
because there is no way you could have been that disorganized."
08:01
Isn't that great?
08:02
(Laughter)
08:03
That's great.
08:05
It's great because, first of all, I needed feedback like that.
08:08
I need feedback like that.
08:09
And it's great because if I don't let Jim, and people like Jim,
08:13
to express their points of view,
08:14
our relationship wouldn't be the same.
08:16
And if I didn't make that public for everybody to see,
08:19
we wouldn't have an idea meritocracy.
08:23
So for the last 25 years that's how we've been operating.
08:27
We've been operating with this radical transparency
08:30
and then collecting these principles,
08:32
largely from making mistakes,
08:34
and then embedding those principles into algorithms.
08:39
And then those algorithms provide --
08:42
we're following the algorithms
08:44
in parallel with our thinking.
08:47
That has been how we've run the investment business,
08:50
and it's how we also deal with the people management.
08:53
In order to give you a glimmer into what this looks like,
08:56
I'd like to take you into a meeting
08:59
and introduce you to a tool of ours called the "Dot Collector"
09:02
that helps us do this.
09:07
A week after the US election,
09:09
our research team held a meeting
09:11
to discuss what a Trump presidency would mean for the US economy.
09:15
Naturally, people had different opinions on the matter
09:18
and how we were approaching the discussion.
09:21
The "Dot Collector" collects these views.
09:24
It has a list of a few dozen attributes,
09:26
so whenever somebody thinks something about another person's thinking,
09:30
it's easy for them to convey their assessment;
09:33
they simply note the attribute and provide a rating from one to 10.
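A minimal sketch of how dot-style ratings like these could be aggregated: each "dot" is a (rater, ratee, attribute, 1-to-10 score) record, and one person's dots can be averaged per attribute. The names, attributes, and scores below are hypothetical; the real Dot Collector's internals are not publicly specified.

```python
from collections import defaultdict

# Hypothetical dot log: (rater, ratee, attribute, score 1-10).
# Names and attributes are invented for illustration.
ratings = [
    ("Jen",   "Ray", "open-mindedness vs. assertiveness", 3),
    ("Larry", "Ray", "open-mindedness vs. assertiveness", 7),
    ("Jen",   "Ray", "preparation", 4),
]

def average_by_attribute(ratings, ratee):
    """Aggregate everyone's dots on one person into per-attribute averages."""
    by_attr = defaultdict(list)
    for rater, target, attribute, score in ratings:
        if target == ratee:
            by_attr[attribute].append(score)
    return {attr: sum(scores) / len(scores) for attr, scores in by_attr.items()}

print(average_by_attribute(ratings, "Ray"))
```

Aggregating dots this way is what turns one person's 3-out-of-10 into a single data point among many, rather than a verdict.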
09:39
For example, as the meeting began,
09:41
a researcher named Jen rated me a three --
09:45
in other words, badly --
09:47
(Laughter)
09:48
for not showing a good balance of open-mindedness and assertiveness.
09:53
As the meeting transpired,
09:55
Jen's assessments of people added up like this.
09:59
Others in the room have different opinions.
10:01
That's normal.
10:03
Different people are always going to have different opinions.
10:06
And who knows who's right?
10:09
Let's look at just what people thought about how I was doing.
10:13
Some people thought I did well,
10:15
others, poorly.
10:17
With each of these views,
10:19
we can explore the thinking behind the numbers.
10:22
Here's what Jen and Larry said.
10:25
Note that everyone gets to express their thinking,
10:28
including their critical thinking,
10:29
regardless of their position in the company.
10:32
Jen, who's 24 years old and right out of college,
10:36
can tell me, the CEO, that I'm approaching things terribly.
10:40
This tool helps people both express their opinions
10:44
and then separate themselves from their opinions
10:47
to see things from a higher level.
10:50
When Jen and others shift their attentions from inputting their own opinions
10:55
to looking down on the whole screen,
10:57
their perspective changes.
11:00
They see their own opinions as just one of many
11:03
and naturally start asking themselves,
11:06
"How do I know my opinion is right?"
11:09
That shift in perspective is like going from seeing in one dimension
11:13
to seeing in multiple dimensions.
11:15
And it shifts the conversation from arguing over our opinions
11:19
to figuring out objective criteria for determining which opinions are best.
11:24
Behind the "Dot Collector" is a computer that is watching.
11:28
It watches what all these people are thinking
11:31
and it correlates that with how they think.
11:33
And it communicates advice back to each of them based on that.
11:38
Then it draws the data from all the meetings
11:41
to create a pointillist painting of what people are like
11:45
and how they think.
11:46
And it does that guided by algorithms.
11:50
Knowing what people are like helps to match them better with their jobs.
11:54
For example,
11:56
a creative thinker who is unreliable
11:57
might be matched up with someone who's reliable but not creative.
12:02
Knowing what people are like also allows us to decide
12:05
what responsibilities to give them
12:07
and to weigh our decisions based on people's merits.
12:11
We call it their believability.
12:14
Here's an example of a vote that we took
12:16
where the majority of people felt one way ...
12:20
but when we weighed the views based on people's merits,
12:23
the answer was completely different.
12:26
This process allows us to make decisions not based on democracy,
12:31
not based on autocracy,
12:33
but based on algorithms that take people's believability into consideration.
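The believability-weighted vote he describes can be sketched as a weighted tally: the raw majority picks one answer, but weighting each ballot by the voter's believability can flip the result, which is exactly the situation in the vote he shows. The weights below are invented for illustration; Bridgewater's actual believability scores are not public.

```python
def believability_weighted_vote(votes):
    """votes: list of (choice, believability_weight).
    Returns the choice with the largest total weight, not the most ballots."""
    totals = {}
    for choice, weight in votes:
        totals[choice] = totals.get(choice, 0.0) + weight
    return max(totals, key=totals.get)

# Hypothetical ballot: three low-believability "yes" votes are outweighed
# by two high-believability "no" votes, so the weighted answer flips.
votes = [("yes", 0.2), ("yes", 0.3), ("yes", 0.2), ("no", 0.9), ("no", 0.8)]

choices = [c for c, _ in votes]
raw_majority = max(set(choices), key=choices.count)
print(raw_majority)                        # the unweighted head count
print(believability_weighted_vote(votes))  # the believability-weighted answer
```

With these numbers the head count says "yes" (3 to 2) while the weighted tally says "no" (1.7 to 0.7) — a one-person-one-vote democracy and a believability-weighted meritocracy disagree on the same ballots.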
12:41
Yup, we really do this.
12:43
(Laughter)
12:46
We do it because it eliminates
12:49
what I believe to be one of the greatest tragedies of mankind,
12:53
and that is people arrogantly,
12:56
naïvely holding opinions in their minds that are wrong,
13:01
and acting on them,
13:02
and not putting them out there to stress test them.
13:05
And that's a tragedy.
13:07
And we do it because it elevates ourselves above our own opinions
13:12
so that we start to see things through everybody's eyes,
13:15
and we see things collectively.
13:18
Collective decision-making is so much better than individual decision-making
13:22
if it's done well.
13:24
It's been the secret sauce behind our success.
13:26
It's why we've made more money for our clients
13:29
than any other hedge fund in existence
13:30
and made money 23 out of the last 26 years.
13:35
So what's the problem with being radically truthful
13:40
and radically transparent with each other?
13:45
People say it's emotionally difficult.
13:48
Critics say it's a formula for a brutal work environment.
13:53
Neuroscientists tell me it has to do with how our brains are prewired.
13:58
There's a part of our brain that would like to know our mistakes
14:01
and like to look at our weaknesses so we could do better.
14:05
I'm told that that's the prefrontal cortex.
14:08
And then there's a part of our brain which views all of this as attacks.
14:13
I'm told that that's the amygdala.
14:16
In other words, there are two you's inside you:
14:19
there's an emotional you
14:20
and there's an intellectual you,
14:22
and often they're at odds,
14:24
and often they work against you.
14:26
It's been our experience that we can win this battle.
14:30
We win it as a group.
14:32
It takes about 18 months typically
14:35
to find that most people prefer operating this way,
14:38
with this radical transparency
14:40
than to be operating in a more opaque environment.
14:43
There's not politics, there's not the brutality of --
14:47
you know, all of that hidden, behind-the-scenes --
14:50
there's an idea meritocracy where people can speak up.
14:53
And that's been great.
14:54
It's given us more effective work,
14:56
and it's given us more effective relationships.
14:59
But it's not for everybody.
15:01
We found something like 25 or 30 percent of the population
15:04
it's just not for.
15:06
And by the way,
15:07
when I say radical transparency,
15:09
I'm not saying transparency about everything.
15:11
I mean, you don't have to tell somebody that their bald spot is growing
15:15
or their baby's ugly.
15:17
So, I'm just talking about --
15:19
(Laughter)
15:20
talking about the important things.
15:22
So --
15:23
(Laughter)
15:28
So when you leave this room,
15:29
I'd like you to observe yourself in conversations with others.
15:35
Imagine if you knew what they were really thinking,
15:39
and imagine if you knew what they were really like ...
15:43
and imagine if they knew what you were really thinking
15:47
and what you were really like.
15:49
It would certainly clear things up a lot
15:52
and make your operations together more effective.
15:55
I think it will improve your relationships.
15:58
Now imagine that you can have algorithms
16:01
that will help you gather all of that information
16:05
and even help you make decisions in an idea-meritocratic way.
16:12
This sort of radical transparency is coming at you
16:16
and it is going to affect your life.
16:19
And in my opinion,
16:21
it's going to be wonderful.
16:22
So I hope it is as wonderful for you
16:25
as it is for me.
16:26
Thank you very much.
16:28
(Applause)