How algorithms shape our world | Kevin Slavin

483,226 views ・ 2011-07-21

TED


μ•„λž˜ μ˜λ¬Έμžλ§‰μ„ λ”λΈ”ν΄λ¦­ν•˜μ‹œλ©΄ μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€.

Translator: Sue J. Hur    Reviewer: Young-ho Park

00:15
This is a photograph by the artist Michael Najjar, and it's real, in the sense that he went there to Argentina to take the photo. But it's also a fiction. There's a lot of work that went into it after that. And what he's done is he's actually reshaped, digitally, all of the contours of the mountains to follow the vicissitudes of the Dow Jones index. So what you see, that precipice, that high precipice with the valley, is the 2008 financial crisis. The photo was made when we were deep in the valley over there. I don't know where we are now. This is the Hang Seng index for Hong Kong. And similar topography. I wonder why.

00:57
And this is art. This is metaphor. But I think the point is that this is metaphor with teeth, and it's with those teeth that I want to propose today that we rethink a little bit about the role of contemporary math -- not just financial math, but math in general. That its transition from being something that we extract and derive from the world to something that actually starts to shape it -- the world around us and the world inside us. And it's specifically algorithms, which are basically the math that computers use to decide stuff. They acquire the sensibility of truth because they repeat over and over again, and they ossify and calcify, and they become real.

01:42
And I was thinking about this, of all places, on a transatlantic flight a couple of years ago, because I happened to be seated next to a Hungarian physicist about my age and we were talking about what life was like during the Cold War for physicists in Hungary. And I said, "So what were you doing?" And he said, "Well we were mostly breaking stealth." And I said, "That's a good job. That's interesting. How does that work?" And to understand that, you have to understand a little bit about how stealth works. And so -- this is an over-simplification -- but basically, it's not like you can just pass a radar signal right through 156 tons of steel in the sky. It's not just going to disappear. But if you can take this big, massive thing, and you could turn it into a million little things -- something like a flock of birds -- well then the radar that's looking for that has to be able to see every flock of birds in the sky. And if you're a radar, that's a really bad job.

02:44
And he said, "Yeah." He said, "But that's if you're a radar. So we didn't use a radar; we built a black box that was looking for electrical signals, electronic communication. And whenever we saw a flock of birds that had electronic communication, we thought, 'Probably has something to do with the Americans.'" And I said, "Yeah. That's good. So you've effectively negated 60 years of aeronautic research. What's your act two? What do you do when you grow up?" And he said, "Well, financial services." And I said, "Oh." Because those had been in the news lately. And I said, "How does that work?" And he said, "Well there's 2,000 physicists on Wall Street now, and I'm one of them." And I said, "What's the black box for Wall Street?" And he said, "It's funny you ask that, because it's actually called black box trading. And it's also sometimes called algo trading, algorithmic trading."

03:41
And algorithmic trading evolved in part because institutional traders have the same problems that the United States Air Force had, which is that they're moving these positions -- whether it's Procter & Gamble or Accenture, whatever -- they're moving a million shares of something through the market. And if they do that all at once, it's like playing poker and going all in right away. You just tip your hand. And so they have to find a way -- and they use algorithms to do this -- to break up that big thing into a million little transactions. And the magic and the horror of that is that the same math that you use to break up the big thing into a million little things can be used to find a million little things and sew them back together and figure out what's actually happening in the market. So if you need to have some image of what's happening in the stock market right now, what you can picture is a bunch of algorithms that are basically programmed to hide, and a bunch of algorithms that are programmed to go find them and act.

04:40
And all of that's great, and it's fine. And that's 70 percent of the United States stock market, 70 percent of the operating system formerly known as your pension, your mortgage. And what could go wrong? What could go wrong is that a year ago, nine percent of the entire market just disappears in five minutes, and they called it the Flash Crash of 2:45. All of a sudden, nine percent just goes away, and nobody to this day can even agree on what happened because nobody ordered it, nobody asked for it. Nobody had any control over what was actually happening. All they had was just a monitor in front of them that had the numbers on it and just a red button that said, "Stop."

05:30
And that's the thing, is that we're writing things, we're writing these things that we can no longer read. And we've rendered something illegible, and we've lost the sense of what's actually happening in this world that we've made. And we're starting to make our way. There's a company in Boston called Nanex, and they use math and magic and I don't know what, and they reach into all the market data and they find, actually sometimes, some of these algorithms. And when they find them they pull them out and they pin them to the wall like butterflies. And they do what we've always done when confronted with huge amounts of data that we don't understand -- which is that they give them a name and a story. So this is one that they found, they called the Knife, the Carnival, the Boston Shuffler, Twilight.

06:31
And the gag is that, of course, these aren't just running through the market. You can find these kinds of things wherever you look, once you learn how to look for them. You can find it here: this book about flies that you may have been looking at on Amazon. You may have noticed it when its price started at 1.7 million dollars. It's out of print -- still ... (Laughter) If you had bought it at 1.7, it would have been a bargain. A few hours later, it had gone up to 23.6 million dollars, plus shipping and handling. And the question is: Nobody was buying or selling anything; what was happening? And you see this behavior on Amazon as surely as you see it on Wall Street. And when you see this kind of behavior, what you see is the evidence of algorithms in conflict, algorithms locked in loops with each other, without any human oversight, without any adult supervision to say, "Actually, 1.7 million is plenty." (Laughter)

07:30
And as with Amazon, so it is with Netflix. And so Netflix has gone through several different algorithms over the years. They started with Cinematch, and they've tried a bunch of others -- there's Dinosaur Planet; there's Gravity. They're using Pragmatic Chaos now. Pragmatic Chaos is, like all of Netflix algorithms, trying to do the same thing. It's trying to get a grasp on you, on the firmware inside the human skull, so that it can recommend what movie you might want to watch next -- which is a very, very difficult problem. But the difficulty of the problem and the fact that we don't really quite have it down, it doesn't take away from the effects Pragmatic Chaos has. Pragmatic Chaos, like all Netflix algorithms, determines, in the end, 60 percent of what movies end up being rented. So one piece of code with one idea about you is responsible for 60 percent of those movies.

08:25
But what if you could rate those movies before they get made? Wouldn't that be handy? Well, a few data scientists from the U.K. are in Hollywood, and they have "story algorithms" -- a company called Epagogix. And you can run your script through there, and they can tell you, quantifiably, that that's a 30 million dollar movie or a 200 million dollar movie. And the thing is, is that this isn't Google. This isn't information. These aren't financial stats; this is culture. And what you see here, or what you don't really see normally, is that these are the physics of culture. And if these algorithms, like the algorithms on Wall Street, just crashed one day and went awry, how would we know? What would it look like?

09:12
And they're in your house. They're in your house. These are two algorithms competing for your living room. These are two different cleaning robots that have very different ideas about what clean means. And you can see it if you slow it down and attach lights to them, and they're sort of like secret architects in your bedroom. And the idea that architecture itself is somehow subject to algorithmic optimization is not far-fetched. It's super-real and it's happening around you. You feel it most when you're in a sealed metal box, a new-style elevator; they're called destination-control elevators. These are the ones where you have to press what floor you're going to go to before you get in the elevator. And it uses what's called a bin-packing algorithm. So none of this mishegas of letting everybody go into whatever car they want. Everybody who wants to go to the 10th floor goes into car two, and everybody who wants to go to the third floor goes into car five.

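A destination-dispatch controller is, at its core, a grouping problem: pack riders with shared destinations into the same car so each car makes fewer stops. The sketch below uses a greedy first-fit rule and six cars purely for illustration; real dispatchers also weigh waiting time, car capacity, and travel direction.

    from collections import defaultdict

    def assign_cars(requests, num_cars=6):
        """Greedily group lobby keypad requests (destination floors) into cars.

        A toy stand-in for the 'bin-packing' style dispatcher mentioned in the
        talk: reuse a car already stopping at the requested floor, otherwise
        take the least-loaded car.
        """
        serves = defaultdict(list)
        for floor in requests:
            car = next((c for c, floors in serves.items() if floor in floors), None)
            if car is None:
                car = min(range(num_cars), key=lambda c: len(serves[c]))
            serves[car].append(floor)
        return {c: floors for c, floors in serves.items() if floors}

    # Eight riders at the lobby: everyone bound for floor 10 shares one car, and so on.
    print(assign_cars([10, 3, 10, 7, 3, 10, 12, 3]))
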
10:04
And the problem with that is that people freak out. People panic. And you see why. You see why. It's because the elevator is missing some important instrumentation, like the buttons. (Laughter) Like the things that people use. All it has is just the number that moves up or down and that red button that says, "Stop."

10:29
And this is what we're designing for. We're designing for this machine dialect. And how far can you take that? How far can you take it? You can take it really, really far.

10:41
So let me take it back to Wall Street. Because the algorithms of Wall Street are dependent on one quality above all else, which is speed. And they operate on milliseconds and microseconds. And just to give you a sense of what microseconds are, it takes you 500,000 microseconds just to click a mouse. But if you're a Wall Street algorithm and you're five microseconds behind, you're a loser. So if you were an algorithm, you'd look for an architect like the one that I met in Frankfurt who was hollowing out a skyscraper -- throwing out all the furniture, all the infrastructure for human use, and just running steel on the floors to get ready for the stacks of servers to go in -- all so an algorithm could get close to the Internet.

11:28
And you think of the Internet as this kind of distributed system. And of course, it is, but it's distributed from places. In New York, this is where it's distributed from: the Carrier Hotel located on Hudson Street. And this is really where the wires come right up into the city. And the reality is that the further away you are from that, you're a few microseconds behind every time. These guys down on Wall Street, Marco Polo and Cherokee Nation, they're eight microseconds behind all these guys going into the empty buildings being hollowed out up around the Carrier Hotel. And that's going to keep happening. We're going to keep hollowing them out, because you, inch for inch and pound for pound and dollar for dollar, none of you could squeeze revenue out of that space like the Boston Shuffler could.

12:20
But if you zoom out, if you zoom out, you would see an 825-mile trench between New York City and Chicago that's been built over the last few years by a company called Spread Networks. This is a fiber optic cable that was laid between those two cities to just be able to traffic one signal 37 times faster than you can click a mouse -- just for these algorithms, just for the Carnival and the Knife.

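That "37 times faster than you can click a mouse" figure holds up to back-of-the-envelope arithmetic. The only added assumption is the standard rule of thumb that light in optical fiber covers roughly 200,000 km per second, about two-thirds of its speed in vacuum.

    # Sanity-checking the Spread Networks numbers quoted in the talk.
    miles = 825
    km = miles * 1.609                   # ~1,327 km between New York and Chicago
    fiber_speed_km_s = 200_000           # assumed speed of light in fiber
    one_way_us = km / fiber_speed_km_s * 1e6
    round_trip_us = 2 * one_way_us
    mouse_click_us = 500_000             # figure quoted in the talk
    print(f"one-way latency: {one_way_us:,.0f} microseconds")     # ~6,600
    print(f"round trip: {round_trip_us:,.0f} microseconds")       # ~13,300
    print(f"mouse click is {mouse_click_us / round_trip_us:.0f}x slower")  # ~38x
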
12:51
And when you think about this, that we're running through the United States with dynamite and rock saws so that an algorithm can close the deal three microseconds faster, all for a communications framework that no human will ever know, that's a kind of manifest destiny; and we'll always look for a new frontier. Unfortunately, we have our work cut out for us. This is just theoretical. This is some mathematicians at MIT. And the truth is I don't really understand a lot of what they're talking about. It involves light cones and quantum entanglement, and I don't really understand any of that. But I can read this map, and what this map says is that, if you're trying to make money on the markets where the red dots are, that's where people are, where the cities are, you're going to have to put the servers where the blue dots are to do that most effectively. And the thing that you might have noticed about those blue dots is that a lot of them are in the middle of the ocean. So that's what we'll do: we'll build bubbles or something, or platforms. We'll actually part the water to pull money out of the air, because it's a bright future if you're an algorithm. (Laughter)

14:06
And it's not the money that's so interesting actually. It's what the money motivates, that we're actually terraforming the Earth itself with this kind of algorithmic efficiency. And in that light, you go back and you look at Michael Najjar's photographs, and you realize that they're not metaphor, they're prophecy. They're prophecy for the kind of seismic, terrestrial effects of the math that we're making. And the landscape was always made by this sort of weird, uneasy collaboration between nature and man. But now there's this third co-evolutionary force: algorithms -- the Boston Shuffler, the Carnival. And we will have to understand those as nature, and in a way, they are.

14:54
Thank you.

14:56
(Applause)
이 μ›Ήμ‚¬μ΄νŠΈ 정보

이 μ‚¬μ΄νŠΈλŠ” μ˜μ–΄ ν•™μŠ΅μ— μœ μš©ν•œ YouTube λ™μ˜μƒμ„ μ†Œκ°œν•©λ‹ˆλ‹€. μ „ 세계 졜고의 μ„ μƒλ‹˜λ“€μ΄ κ°€λ₯΄μΉ˜λŠ” μ˜μ–΄ μˆ˜μ—…μ„ 보게 될 κ²ƒμž…λ‹ˆλ‹€. 각 λ™μ˜μƒ νŽ˜μ΄μ§€μ— ν‘œμ‹œλ˜λŠ” μ˜μ–΄ μžλ§‰μ„ 더블 ν΄λ¦­ν•˜λ©΄ κ·Έκ³³μ—μ„œ λ™μ˜μƒμ΄ μž¬μƒλ©λ‹ˆλ‹€. λΉ„λ””μ˜€ μž¬μƒμ— 맞좰 μžλ§‰μ΄ μŠ€ν¬λ‘€λ©λ‹ˆλ‹€. μ˜κ²¬μ΄λ‚˜ μš”μ²­μ΄ μžˆλŠ” 경우 이 문의 양식을 μ‚¬μš©ν•˜μ—¬ λ¬Έμ˜ν•˜μ‹­μ‹œμ˜€.

https://forms.gle/WvT1wiN1qDtmnspy7