How to build a company where the best ideas win | Ray Dalio

TED ・ 2017-09-06



Whether you like it or not, radical transparency and algorithmic decision-making is coming at you fast, and it's going to change your life. That's because it's now easy to take algorithms and embed them into computers and gather all that data that you're leaving on yourself all over the place, and know what you're like, and then direct the computers to interact with you in ways that are better than most people can. Well, that might sound scary. I've been doing this for a long time and I have found it to be wonderful. My objective has been to have meaningful work and meaningful relationships with the people I work with, and I've learned that I couldn't have that unless I had that radical transparency and that algorithmic decision-making. I want to show you why that is, I want to show you how it works. And I warn you that some of the things that I'm going to show you probably are a little bit shocking.

Since I was a kid, I've had a terrible rote memory. And I didn't like following instructions, I was no good at following instructions. But I loved to figure out how things worked for myself. When I was 12, I hated school but I fell in love with trading the markets. I caddied at the time, earned about five dollars a bag. And I took my caddying money, and I put it in the stock market. And that was just because the stock market was hot at the time. And the first company I bought was a company by the name of Northeast Airlines. Northeast Airlines was the only company I heard of that was selling for less than five dollars a share. (Laughter) And I figured I could buy more shares, and if it went up, I'd make more money. So, it was a dumb strategy, right? But I tripled my money, and I tripled my money because I got lucky. The company was about to go bankrupt, but some other company acquired it, and I tripled my money. And I was hooked. And I thought, "This game is easy."

With time, I learned this game is anything but easy. In order to be an effective investor, one has to bet against the consensus and be right. And it's not easy to bet against the consensus and be right. One has to bet against the consensus and be right because the consensus is built into the price. And in order to be an entrepreneur, a successful entrepreneur, one has to bet against the consensus and be right.

I had to be an entrepreneur and an investor -- and what goes along with that is making a lot of painful mistakes. So I made a lot of painful mistakes, and with time, my attitude about those mistakes began to change. I began to think of them as puzzles. That if I could solve the puzzles, they would give me gems. And the puzzles were: What would I do differently in the future so I wouldn't make that painful mistake? And the gems were principles that I would then write down so I would remember them, principles that would help me in the future. And because I wrote them down so clearly, I could then -- I eventually discovered -- embed them into algorithms. And those algorithms would be embedded in computers, and the computers would make decisions along with me; and so in parallel, we would make these decisions. And I could see how those decisions then compared with my own decisions, and I could see that those decisions were a lot better. And that was because the computer could make decisions much faster, it could process a lot more information and it can process decisions much more -- less emotionally. So it radically improved my decision-making.

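To make the pattern concrete, here is a minimal sketch of the idea as described: principles written down clearly enough become explicit if/then rules that a computer can evaluate alongside a human. The principle texts, signals, and thresholds below are invented for illustration; this is not Bridgewater's actual system.

```python
# Minimal sketch: principles as explicit if/then rules evaluated in
# parallel with a human decision. All names, signals, and thresholds
# are invented for illustration (not Bridgewater's actual system).
from dataclasses import dataclass
from typing import Callable

@dataclass
class Principle:
    name: str
    applies: Callable[[dict], bool]  # when does this principle apply?
    decision: str                    # what it recommends when it does

PRINCIPLES = [
    Principle("don't bet against the consensus without high conviction",
              lambda s: s["conviction"] < 0.7, "hold"),
    Principle("cut exposure when volatility spikes",
              lambda s: s["volatility"] > 0.3, "reduce"),
]

def algorithmic_decision(situation: dict) -> list[str]:
    """Return the recommendation of every principle that applies."""
    return [p.decision for p in PRINCIPLES if p.applies(situation)]

# Deciding "in parallel": the computer's output is compared with the
# human's call rather than silently replacing it.
situation = {"conviction": 0.6, "volatility": 0.35}
print("computer says:", algorithmic_decision(situation))  # ['hold', 'reduce']
print("human says:    buy")  # a disagreement is a prompt to re-examine
```
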
Eight years after I started Bridgewater, I had my greatest failure, my greatest mistake. It was the late 1970s, I was 34 years old, and I had calculated that American banks had lent much more money to emerging countries than those countries were going to be able to pay back, and that we would have the greatest debt crisis since the Great Depression. And with it, an economic crisis and a big bear market in stocks. It was a controversial view at the time. People thought it was kind of a crazy point of view. But in August 1982, Mexico defaulted on its debt, and a number of other countries followed. And we had the greatest debt crisis since the Great Depression. And because I had anticipated that, I was asked to testify to Congress and appear on "Wall Street Week," which was the show of the time. Just to give you a flavor of that, I've got a clip here, and you'll see me in there.

(Video) Ray Dalio: Mr. Chairman, Mr. Mitchell, it's a great pleasure and a great honor to be able to appear before you in examination with what is going wrong with our economy. The economy is now flat -- teetering on the brink of failure.

Martin Zweig: You were recently quoted in an article. You said, "I can say this with absolute certainty because I know how markets work."

Ray Dalio: I can say with absolute certainty that if you look at the liquidity base in the corporations and the world as a whole, that there's such reduced level of liquidity that you can't return to an era of stagflation.

I look at that now, and I think, "What an arrogant jerk!" (Laughter) I was so arrogant, and I was so wrong. I mean, while the debt crisis happened, the stock market and the economy went up rather than going down, and I lost so much money for myself and for my clients that I had to shut down my operation pretty much, I had to let almost everybody go. And these were like extended family, I was heartbroken. And I had lost so much money that I had to borrow 4,000 dollars from my dad to help to pay my family bills. It was one of the most painful experiences of my life ... but it turned out to be one of the greatest experiences of my life, because it changed my attitude about decision-making. Rather than thinking, "I'm right," I started to ask myself, "How do I know I'm right?"

I gained a humility that I needed in order to balance my audacity. I wanted to find the smartest people who would disagree with me, to try to understand their perspective or to have them stress test my perspective. I wanted to make an idea meritocracy. In other words, not an autocracy in which I would lead and others would follow, and not a democracy in which everybody's points of view were equally valued, but an idea meritocracy in which the best ideas would win out. And in order to do that, I realized that we would need radical truthfulness and radical transparency. What I mean by radical truthfulness and radical transparency is people needed to say what they really believed and to see everything. And we literally tape almost all conversations and let everybody see everything, because if we didn't do that, we couldn't really have an idea meritocracy.

In order to have an idea meritocracy, we have to let people speak and say what they want. Just to give you an example, this is an email from Jim Haskel -- somebody who works for me -- and this was available to everybody in the company. "Ray, you deserve a 'D-' for your performance today in the meeting ... you did not prepare at all well because there is no way you could have been that disorganized." Isn't that great? (Laughter) That's great. It's great because, first of all, I needed feedback like that. I need feedback like that. And it's great because if I don't let Jim, and people like Jim, express their points of view, our relationship wouldn't be the same. And if I didn't make that public for everybody to see, we wouldn't have an idea meritocracy.

So for the last 25 years, that's how we've been operating. We've been operating with this radical transparency, and then collecting these principles, largely from making mistakes, and then embedding those principles into algorithms. And then those algorithms provide -- we're following the algorithms in parallel with our thinking. That has been how we've run the investment business, and it's how we also deal with people management.

In order to give you a glimmer into what this looks like, I'd like to take you into a meeting and introduce you to a tool of ours called the "Dot Collector" that helps us do this.

A week after the US election, our research team held a meeting to discuss what a Trump presidency would mean for the US economy. Naturally, people had different opinions on the matter and on how we were approaching the discussion. The "Dot Collector" collects these views. It has a list of a few dozen attributes, so whenever somebody thinks something about another person's thinking, it's easy for them to convey their assessment; they simply note the attribute and provide a rating from one to 10.

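A minimal sketch of what one of these "dots" could look like as data, assuming only what is described here: an attribute from a fixed list plus a rating from one to 10. The attribute names and the validation rule are assumptions for illustration.

```python
# Sketch of a "dot" as a record: who rated whom, on which attribute
# (from a fixed list), with a rating from one to 10. The attribute
# names here are hypothetical.
from dataclasses import dataclass

ATTRIBUTES = {"open-mindedness and assertiveness", "synthesizing", "creativity"}

@dataclass(frozen=True)
class Dot:
    rater: str
    subject: str
    attribute: str
    rating: int  # 1 (poor) through 10 (excellent)

    def __post_init__(self):
        assert self.attribute in ATTRIBUTES, "attribute must be on the fixed list"
        assert 1 <= self.rating <= 10, "ratings run from one to 10"

# Jen's opening assessment of Ray, from the example that follows:
meeting_dots = [Dot("Jen", "Ray", "open-mindedness and assertiveness", 3)]
```
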
For example, as the meeting began, a researcher named Jen rated me a three -- in other words, badly -- (Laughter) for not showing a good balance of open-mindedness and assertiveness. As the meeting transpired, Jen's assessments of people added up like this. Others in the room have different opinions. That's normal. Different people are always going to have different opinions. And who knows who's right?

Let's look at just what people thought about how I was doing. Some people thought I did well, others, poorly. With each of these views, we can explore the thinking behind the numbers. Here's what Jen and Larry said. Note that everyone gets to express their thinking, including their critical thinking, regardless of their position in the company. Jen, who's 24 years old and right out of college, can tell me, the CEO, that I'm approaching things terribly.

This tool helps people both express their opinions and then separate themselves from their opinions to see things from a higher level. When Jen and others shift their attention from inputting their own opinions to looking down on the whole screen, their perspective changes. They see their own opinions as just one of many and naturally start asking themselves, "How do I know my opinion is right?" That shift in perspective is like going from seeing in one dimension to seeing in multiple dimensions. And it shifts the conversation from arguing over our opinions to figuring out objective criteria for determining which opinions are best.

Behind the "Dot Collector" is a computer that is watching. It watches what all these people are thinking and it correlates that with how they think. And it communicates advice back to each of them based on that. Then it draws the data from all the meetings to create a pointillist painting of what people are like and how they think. And it does that guided by algorithms.

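Here is a rough sketch of that aggregation step, under the assumption that dots are simply averaged per person and per attribute across meetings. The talk only says the combining is "guided by algorithms," so the averaging rule here is an assumption.

```python
# Sketch of the aggregation: many small dots, averaged per person and
# per attribute across meetings, form a rough profile of how each
# person thinks. The averaging rule is an assumption.
from collections import defaultdict
from statistics import mean
from typing import NamedTuple

class Dot(NamedTuple):  # same shape as the record sketched earlier
    rater: str
    subject: str
    attribute: str
    rating: int

def profile(dots: list[Dot]) -> dict[str, dict[str, float]]:
    """Average each person's ratings, attribute by attribute."""
    raw = defaultdict(lambda: defaultdict(list))
    for d in dots:
        raw[d.subject][d.attribute].append(d.rating)
    return {person: {attr: mean(rs) for attr, rs in attrs.items()}
            for person, attrs in raw.items()}

dots = [Dot("Jen", "Ray", "open-mindedness", 3),
        Dot("Larry", "Ray", "open-mindedness", 7)]
print(profile(dots))  # Ray's open-mindedness averages out to 5
```
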
Knowing what people are like helps to match them better with their jobs. For example, a creative thinker who is unreliable might be matched up with someone who's reliable but not creative. Knowing what people are like also allows us to decide what responsibilities to give them and to weigh our decisions based on people's merits. We call it their believability.

Here's an example of a vote that we took, where the majority of people felt one way ... but when we weighed the views based on people's merits, the answer was completely different. This process allows us to make decisions not based on democracy, not based on autocracy, but based on algorithms that take people's believability into consideration.

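A minimal sketch of believability weighting, assuming the simplest rule consistent with this description: each person's vote counts in proportion to their believability on the topic. The names, weights, and 0-to-1 scale are all assumptions.

```python
# Sketch of believability-weighted decision-making: each vote counts
# in proportion to the voter's believability. Names, weights, and the
# 0-to-1 scale are illustrative assumptions.
from collections import Counter, defaultdict

def weighted_vote(votes: dict[str, str], believability: dict[str, float]) -> str:
    """Pick the option with the most total believability behind it."""
    totals = defaultdict(float)
    for voter, choice in votes.items():
        totals[choice] += believability.get(voter, 0.0)
    return max(totals, key=totals.get)

votes = {"Ann": "yes", "Bob": "yes", "Cara": "yes", "Dev": "no", "Eve": "no"}
believability = {"Ann": 0.2, "Bob": 0.1, "Cara": 0.2, "Dev": 0.9, "Eve": 0.8}

# The majority feels one way, but the weighted answer is different:
print("one person, one vote:", Counter(votes.values()).most_common(1)[0][0])  # yes
print("believability-weighted:", weighted_vote(votes, believability))         # no
```
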
Yup, we really do this. (Laughter) We do it because it eliminates what I believe to be one of the greatest tragedies of mankind, and that is people arrogantly, naïvely holding opinions in their minds that are wrong, and acting on them, and not putting them out there to stress test them. And that's a tragedy. And we do it because it elevates ourselves above our own opinions so that we start to see things through everybody's eyes, and we see things collectively.

Collective decision-making is so much better than individual decision-making if it's done well. It's been the secret sauce behind our success. It's why we've made more money for our clients than any other hedge fund in existence and made money 23 out of the last 26 years.

So what's the problem with being radically truthful and radically transparent with each other? People say it's emotionally difficult. Critics say it's a formula for a brutal work environment. Neuroscientists tell me it has to do with how our brains are prewired. There's a part of our brain that would like to know our mistakes and look at our weaknesses so we could do better. I'm told that that's the prefrontal cortex. And then there's a part of our brain which views all of this as attacks. I'm told that that's the amygdala.

In other words, there are two you's inside you: there's an emotional you and there's an intellectual you, and often they're at odds, and often they work against you. It's been our experience that we can win this battle. We win it as a group. It takes about 18 months, typically, to find that most people prefer operating this way, with this radical transparency, rather than operating in a more opaque environment. There's not politics, there's not the brutality of -- you know, all of that hidden, behind-the-scenes -- there's an idea meritocracy where people can speak up. And that's been great. It's given us more effective work, and it's given us more effective relationships. But it's not for everybody. We found something like 25 or 30 percent of the population it's just not for.

And by the way, when I say radical transparency, I'm not saying transparency about everything. I mean, you don't have to tell somebody that their bald spot is growing or that their baby's ugly. So, I'm just talking about -- (Laughter) talking about the important things. So -- (Laughter)

So when you leave this room, I'd like you to observe yourself in conversations with others. Imagine if you knew what they were really thinking, and imagine if you knew what they were really like ... and imagine if they knew what you were really thinking and what you were really like. It would certainly clear things up a lot and make your operations together more effective. I think it will improve your relationships.

Now imagine that you could have algorithms that will help you gather all of that information and even help you make decisions in an idea-meritocratic way. This sort of radical transparency is coming at you and it is going to affect your life. And in my opinion, it's going to be wonderful. So I hope it is as wonderful for you as it is for me. Thank you very much. (Applause)