How to build a company where the best ideas win | Ray Dalio

1,388,663 views ・ 2017-09-06

TED


00:12
Whether you like it or not, radical transparency and algorithmic decision-making is coming at you fast, and it's going to change your life. That's because it's now easy to take algorithms and embed them into computers and gather all that data that you're leaving on yourself all over the place, and know what you're like, and then direct the computers to interact with you in ways that are better than most people can.

00:37
Well, that might sound scary. I've been doing this for a long time and I have found it to be wonderful. My objective has been to have meaningful work and meaningful relationships with the people I work with, and I've learned that I couldn't have that unless I had that radical transparency and that algorithmic decision-making. I want to show you why that is, I want to show you how it works. And I warn you that some of the things that I'm going to show you probably are a little bit shocking.

01:05
Since I was a kid, I've had a terrible rote memory. And I didn't like following instructions, I was no good at following instructions. But I loved to figure out how things worked for myself.

01:18
When I was 12, I hated school but I fell in love with trading the markets. I caddied at the time, earned about five dollars a bag. And I took my caddying money, and I put it in the stock market. And that was just because the stock market was hot at the time. And the first company I bought was a company by the name of Northeast Airlines. Northeast Airlines was the only company I heard of that was selling for less than five dollars a share. (Laughter) And I figured I could buy more shares, and if it went up, I'd make more money. So, it was a dumb strategy, right? But I tripled my money, and I tripled my money because I got lucky. The company was about to go bankrupt, but some other company acquired it, and I tripled my money. And I was hooked. And I thought, "This game is easy."

02:09
With time, I learned this game is anything but easy. In order to be an effective investor, one has to bet against the consensus and be right. And it's not easy to bet against the consensus and be right. One has to bet against the consensus and be right because the consensus is built into the price. And in order to be an entrepreneur, a successful entrepreneur, one has to bet against the consensus and be right.

02:37
I had to be an entrepreneur and an investor -- and what goes along with that is making a lot of painful mistakes. So I made a lot of painful mistakes, and with time, my attitude about those mistakes began to change. I began to think of them as puzzles. That if I could solve the puzzles, they would give me gems. And the puzzles were: What would I do differently in the future so I wouldn't make that painful mistake? And the gems were principles that I would then write down so I would remember them, that would help me in the future.

03:12
And because I wrote them down so clearly, I could then -- eventually discovered -- I could then embed them into algorithms. And those algorithms would be embedded in computers, and the computers would make decisions along with me; and so in parallel, we would make these decisions. And I could see how those decisions then compared with my own decisions, and I could see that those decisions were a lot better. And that was because the computer could make decisions much faster, it could process a lot more information and it can process decisions much more -- less emotionally. So it radically improved my decision-making.

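The talk never shows what one of these encoded principles looks like. Below is a minimal sketch in Python under assumptions of my own, not Bridgewater's actual system: a hypothetical portfolio-concentration principle is written down as an explicit, testable rule, and the computer's decisions are printed next to the human's gut call so the two can be compared in parallel. The function name, threshold, and holdings are all invented for illustration.

```python
# A minimal sketch, not Bridgewater's actual system: a principle written down
# in words becomes an explicit, testable rule, and the computer applies it in
# parallel with the human so the two streams of decisions can be compared.
# The principle, threshold, and holdings below are all hypothetical.

def principle_no_concentration(portfolio, max_weight=0.25):
    """Hypothetical principle: 'No single position should dominate the portfolio.'"""
    total = sum(portfolio.values())
    return [f"trim {asset} ({value / total:.0%} exceeds the {max_weight:.0%} cap)"
            for asset, value in portfolio.items()
            if value / total > max_weight]

portfolio = {"stocks": 70.0, "bonds": 20.0, "gold": 10.0}
print("algorithm says:", principle_no_concentration(portfolio))
print("human says:    ", ["hold everything"])  # the gut call, kept for comparison
```
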
04:00
Eight years after I started Bridgewater, I had my greatest failure, my greatest mistake. It was late 1970s, I was 34 years old, and I had calculated that American banks had lent much more money to emerging countries than those countries were going to be able to pay back and that we would have the greatest debt crisis since the Great Depression. And with it, an economic crisis and a big bear market in stocks. It was a controversial view at the time. People thought it was kind of a crazy point of view.

04:39
But in August 1982, Mexico defaulted on its debt, and a number of other countries followed. And we had the greatest debt crisis since the Great Depression. And because I had anticipated that, I was asked to testify to Congress and appear on "Wall Street Week," which was the show of the time. Just to give you a flavor of that, I've got a clip here, and you'll see me in there.

05:06
(Video) Mr. Chairman, Mr. Mitchell, it's a great pleasure and a great honor to be able to appear before you in examination with what is going wrong with our economy. The economy is now flat -- teetering on the brink of failure.

05:19
Martin Zweig: You were recently quoted in an article. You said, "I can say this with absolute certainty because I know how markets work."

05:26
Ray Dalio: I can say with absolute certainty that if you look at the liquidity base in the corporations and the world as a whole, that there's such reduced level of liquidity that you can't return to an era of stagflation.

05:38
I look at that now, I think, "What an arrogant jerk!" (Laughter) I was so arrogant, and I was so wrong. I mean, while the debt crisis happened, the stock market and the economy went up rather than going down, and I lost so much money for myself and for my clients that I had to shut down my operation pretty much, I had to let almost everybody go. And these were like extended family, I was heartbroken. And I had lost so much money that I had to borrow 4,000 dollars from my dad to help to pay my family bills.

06:16
It was one of the most painful experiences of my life ... but it turned out to be one of the greatest experiences of my life because it changed my attitude about decision-making. Rather than thinking, "I'm right," I started to ask myself, "How do I know I'm right?"

06:36
I gained a humility that I needed in order to balance my audacity. I wanted to find the smartest people who would disagree with me to try to understand their perspective or to have them stress test my perspective. I wanted to make an idea meritocracy. In other words, not an autocracy in which I would lead and others would follow and not a democracy in which everybody's points of view were equally valued, but I wanted to have an idea meritocracy in which the best ideas would win out.

07:07
And in order to do that, I realized that we would need radical truthfulness and radical transparency. What I mean by radical truthfulness and radical transparency is people needed to say what they really believed and to see everything. And we literally tape almost all conversations and let everybody see everything, because if we didn't do that, we couldn't really have an idea meritocracy.

07:34
In order to have an idea meritocracy, we have to let people speak and say what they want. Just to give you an example, this is an email from Jim Haskel -- somebody who works for me -- and this was available to everybody in the company. "Ray, you deserve a 'D-' for your performance today in the meeting ... you did not prepare at all well because there is no way you could have been that disorganized."

08:01
Isn't that great? (Laughter) That's great. It's great because, first of all, I needed feedback like that. I need feedback like that. And it's great because if I don't let Jim, and people like Jim, express their points of view, our relationship wouldn't be the same. And if I didn't make that public for everybody to see, we wouldn't have an idea meritocracy.

08:23
So for the last 25 years, that's how we've been operating. We've been operating with this radical transparency and then collecting these principles, largely from making mistakes, and then embedding those principles into algorithms. And then those algorithms provide -- we're following the algorithms in parallel with our thinking. That has been how we've run the investment business, and it's how we also deal with the people management. In order to give you a glimmer into what this looks like, I'd like to take you into a meeting and introduce you to a tool of ours called the "Dot Collector" that helps us do this.

09:07
A week after the US election, our research team held a meeting to discuss what a Trump presidency would mean for the US economy. Naturally, people had different opinions on the matter and how we were approaching the discussion. The "Dot Collector" collects these views. It has a list of a few dozen attributes, so whenever somebody thinks something about another person's thinking, it's easy for them to convey their assessment; they simply note the attribute and provide a rating from one to 10.

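The talk describes the Dot Collector's input only at this level: a list of a few dozen attributes, and 1-to-10 ratings of one another's thinking. A minimal Python sketch of what one recorded "dot" might contain; the record structure and the second rating are my own assumptions, not the tool's actual schema.

```python
from dataclasses import dataclass

# A sketch of the data a tool like the "Dot Collector" captures, based only on
# what the talk states: each "dot" is one person's 1-10 rating of another
# person's thinking on a named attribute. The second record is hypothetical.

@dataclass
class Dot:
    rater: str
    ratee: str
    attribute: str  # one of the few dozen attributes on the list
    rating: int     # 1 (poor) to 10 (excellent)

dots = [
    Dot("Jen", "Ray", "balancing open-mindedness and assertiveness", 3),  # the example from the talk
    Dot("Larry", "Ray", "balancing open-mindedness and assertiveness", 7),  # hypothetical
]
print(dots[0])
```
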
09:39
For example, as the meeting began, a researcher named Jen rated me a three -- in other words, badly -- (Laughter) for not showing a good balance of open-mindedness and assertiveness. As the meeting transpired, Jen's assessments of people added up like this. Others in the room have different opinions. That's normal. Different people are always going to have different opinions. And who knows who's right?

10:09
Let's look at just what people thought about how I was doing. Some people thought I did well, others, poorly. With each of these views, we can explore the thinking behind the numbers. Here's what Jen and Larry said. Note that everyone gets to express their thinking, including their critical thinking, regardless of their position in the company. Jen, who's 24 years old and right out of college, can tell me, the CEO, that I'm approaching things terribly.

10:40
This tool helps people both express their opinions and then separate themselves from their opinions to see things from a higher level. When Jen and others shift their attentions from inputting their own opinions to looking down on the whole screen, their perspective changes. They see their own opinions as just one of many and naturally start asking themselves, "How do I know my opinion is right?" That shift in perspective is like going from seeing in one dimension to seeing in multiple dimensions. And it shifts the conversation from arguing over our opinions to figuring out objective criteria for determining which opinions are best.

11:24
Behind the "Dot Collector" is a computer that is watching. It watches what all these people are thinking and it correlates that with how they think. And it communicates advice back to each of them based on that. Then it draws the data from all the meetings to create a pointillist painting of what people are like and how they think. And it does that guided by algorithms.

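The aggregation step isn't specified in the talk beyond "guided by algorithms." A per-person, per-attribute mean over all recorded dots is one simple reading, sketched below; every tuple in the sample data is hypothetical.

```python
from collections import defaultdict

# A sketch of the "pointillist painting" step: pooling dots from many meetings
# into a per-person, per-attribute picture. A simple mean is used purely for
# illustration. Dots are hypothetical (rater, ratee, attribute, rating) tuples.

dots = [
    ("Jen", "Ray", "open-mindedness", 3),
    ("Larry", "Ray", "open-mindedness", 7),
    ("Ray", "Jen", "synthesizing", 8),
]

def build_profiles(dots):
    sums = defaultdict(lambda: [0, 0])  # (ratee, attribute) -> [total, count]
    for _rater, ratee, attribute, rating in dots:
        sums[(ratee, attribute)][0] += rating
        sums[(ratee, attribute)][1] += 1
    return {key: total / count for key, (total, count) in sums.items()}

for (person, attribute), score in build_profiles(dots).items():
    print(f"{person} / {attribute}: {score:.1f}")
```
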
11:50
Knowing what people are like helps to match them better with their jobs. For example, a creative thinker who is unreliable might be matched up with someone who's reliable but not creative.

12:02
Knowing what people are like also allows us to decide what responsibilities to give them and to weigh our decisions based on people's merits. We call it their believability. Here's an example of a vote that we took where the majority of people felt one way ... but when we weighed the views based on people's merits, the answer was completely different.

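The mechanics of believability weighting aren't given in the talk. A minimal sketch, with invented voters and weights, shows the effect being described: a 3-to-2 majority one way flips the other way once each vote is scaled by the voter's believability on the topic.

```python
# A sketch of believability weighting as the talk describes it: the same votes,
# tallied once one-person-one-vote and once weighted by each voter's
# believability. Voters, votes, and weights are all hypothetical.

votes = {"Jen": "yes", "Larry": "no", "Ray": "no", "Sam": "yes", "Ana": "yes"}
believability = {"Jen": 0.3, "Larry": 0.9, "Ray": 0.8, "Sam": 0.2, "Ana": 0.4}

def tally(votes, weights=None):
    counts = {}
    for voter, choice in votes.items():
        counts[choice] = counts.get(choice, 0) + (weights[voter] if weights else 1)
    return max(counts, key=counts.get)

print("equal-weight result:   ", tally(votes))                 # 'yes' (3 votes to 2)
print("believability-weighted:", tally(votes, believability))  # 'no'  (1.7 to 0.9)
```
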
12:26
This process allows us to make decisions not based on democracy, not based on autocracy, but based on algorithms that take people's believability into consideration.

12:41
Yup, we really do this. (Laughter) We do it because it eliminates what I believe to be one of the greatest tragedies of mankind, and that is people arrogantly, naïvely holding opinions in their minds that are wrong, and acting on them, and not putting them out there to stress test them. And that's a tragedy. And we do it because it elevates ourselves above our own opinions so that we start to see things through everybody's eyes, and we see things collectively.

13:18
Collective decision-making is so much better than individual decision-making if it's done well. It's been the secret sauce behind our success. It's why we've made more money for our clients than any other hedge fund in existence and made money 23 out of the last 26 years.

13:35
So what's the problem with being radically truthful and radically transparent with each other? People say it's emotionally difficult. Critics say it's a formula for a brutal work environment. Neuroscientists tell me it has to do with how our brains are prewired. There's a part of our brain that would like to know our mistakes and like to look at our weaknesses so we could do better. I'm told that that's the prefrontal cortex. And then there's a part of our brain which views all of this as attacks. I'm told that that's the amygdala. In other words, there are two you's inside you: there's an emotional you and there's an intellectual you, and often they're at odds, and often they work against you.

14:26
It's been our experience that we can win this battle. We win it as a group. It takes about 18 months typically to find that most people prefer operating this way, with this radical transparency, than to be operating in a more opaque environment. There's not politics, there's not the brutality of -- you know, all of that hidden, behind-the-scenes -- there's an idea meritocracy where people can speak up. And that's been great. It's given us more effective work, and it's given us more effective relationships. But it's not for everybody. We found something like 25 or 30 percent of the population it's just not for.

15:06
And by the way, when I say radical transparency, I'm not saying transparency about everything. I mean, you don't have to tell somebody that their bald spot is growing or their baby's ugly. So, I'm just talking about -- (Laughter) talking about the important things. So -- (Laughter)

15:28
So when you leave this room, I'd like you to observe yourself in conversations with others. Imagine if you knew what they were really thinking, and imagine if you knew what they were really like ... and imagine if they knew what you were really thinking and what you were really like. It would certainly clear things up a lot and make your operations together more effective. I think it will improve your relationships.

15:58
Now imagine that you can have algorithms that will help you gather all of that information and even help you make decisions in an idea-meritocratic way. This sort of radical transparency is coming at you and it is going to affect your life. And in my opinion, it's going to be wonderful. So I hope it is as wonderful for you as it is for me. Thank you very much.

16:28
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7