How algorithms shape our world | Kevin Slavin

TED ・ 2011-07-21

00:15
This is a photograph by the artist Michael Najjar, and it's real, in the sense that he went there to Argentina to take the photo. But it's also a fiction. There's a lot of work that went into it after that.

00:28
And what he's done
6
28260
2000
他對相片動了些手腳:
00:30
is he's actually reshaped, digitally,
7
30260
2000
數位化重整
00:32
all of the contours of the mountains
8
32260
2000
整片山脈的形體輪廓,
00:34
to follow the vicissitudes of the Dow Jones index.
9
34260
3000
使其隨道瓊指數曲線變化。
00:37
So what you see,
10
37260
2000
你所看到的
00:39
that precipice, that high precipice with the valley,
11
39260
2000
那個峭壁, 那個有處凹陷的高聳峭壁
00:41
is the 2008 financial crisis.
12
41260
2000
代表2008年的金融危機。
00:43
The photo was made when we were deep in the valley over there. I don't know where we are now. This is the Hang Seng index for Hong Kong. And similar topography. I wonder why.

00:57
And this is art. This is metaphor. But I think the point is that this is metaphor with teeth, and it's with those teeth that I want to propose today that we rethink a little bit about the role of contemporary math -- not just financial math, but math in general. That its transition from being something that we extract and derive from the world to something that actually starts to shape it -- the world around us and the world inside us.

01:26
And it's specifically algorithms, which are basically the math that computers use to decide stuff. They acquire the sensibility of truth because they repeat over and over again, and they ossify and calcify, and they become real.

01:42
And I was thinking about this, of all places, on a transatlantic flight a couple of years ago, because I happened to be seated next to a Hungarian physicist about my age, and we were talking about what life was like during the Cold War for physicists in Hungary.

01:58
And I said, "So what were you doing?" And he said, "Well we were mostly breaking stealth." And I said, "That's a good job. That's interesting. How does that work?"

02:06
And to understand that, you have to understand a little bit about how stealth works. And so -- this is an over-simplification -- but basically, it's not like you can just pass a radar signal right through 156 tons of steel in the sky. It's not just going to disappear.

02:24
But if you can take this big, massive thing, and you could turn it into a million little things -- something like a flock of birds -- well then the radar that's looking for that has to be able to see every flock of birds in the sky. And if you're a radar, that's a really bad job.

02:44
And he said, "Yeah." He said, "But that's if you're a radar. So we didn't use a radar; we built a black box that was looking for electrical signals, electronic communication. And whenever we saw a flock of birds that had electronic communication, we thought, 'Probably has something to do with the Americans.'"

03:01
And I said, "Yeah. That's good. So you've effectively negated 60 years of aeronautic research. What's your act two? What do you do when you grow up?" And he said, "Well, financial services." And I said, "Oh."

03:19
Because those had been in the news lately. And I said, "How does that work?" And he said, "Well there's 2,000 physicists on Wall Street now, and I'm one of them." And I said, "What's the black box for Wall Street?" And he said, "It's funny you ask that, because it's actually called black box trading. And it's also sometimes called algo trading, algorithmic trading."

03:41
And algorithmic trading evolved in part because institutional traders have the same problems that the United States Air Force had, which is that they're moving these positions -- whether it's Procter & Gamble or Accenture, whatever -- they're moving a million shares of something through the market. And if they do that all at once, it's like playing poker and going all in right away. You just tip your hand. And so they have to find a way -- and they use algorithms to do this -- to break up that big thing into a million little transactions.

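The order-slicing idea he describes can be sketched in a few lines. This is a minimal illustration in Python, not any firm's actual execution logic; the average slice size and the randomization are assumptions made up for the example.

```python
import random

def slice_order(total_shares: int, avg_child: int = 300, jitter: float = 0.5):
    """Split one large parent order into many small, randomized child orders.

    Randomizing the child sizes makes the stream of trades harder for other
    algorithms to recognize as a single big position. Real execution
    algorithms also schedule slices against time, volume and price.
    """
    children, remaining = [], total_shares
    lo, hi = int(avg_child * (1 - jitter)), int(avg_child * (1 + jitter))
    while remaining > 0:
        qty = min(remaining, random.randint(max(1, lo), hi))
        children.append(qty)
        remaining -= qty
    return children

child_orders = slice_order(1_000_000)          # the "million shares" example
print(len(child_orders), "child orders; total =", sum(child_orders))
```
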
04:13
And the magic and the horror of that is that the same math that you use to break up the big thing into a million little things can be used to find a million little things and sew them back together and figure out what's actually happening in the market.

04:27
So if you need to have some image of what's happening in the stock market right now, what you can picture is a bunch of algorithms that are basically programmed to hide, and a bunch of algorithms that are programmed to go find them and act. And all of that's great, and it's fine.

04:43
And that's 70 percent of the United States stock market, 70 percent of the operating system formerly known as your pension, your mortgage. And what could go wrong?

04:57
What could go wrong is that a year ago, nine percent of the entire market just disappears in five minutes, and they called it the Flash Crash of 2:45. All of a sudden, nine percent just goes away, and nobody to this day can even agree on what happened, because nobody ordered it, nobody asked for it. Nobody had any control over what was actually happening.

05:20
All they had was just a monitor in front of them that had the numbers on it and just a red button that said, "Stop."

05:30
And that's the thing, is that we're writing things, we're writing these things that we can no longer read. And we've rendered something illegible, and we've lost the sense of what's actually happening in this world that we've made. And we're starting to make our way.

05:50
There's a company in Boston called Nanex, and they use math and magic and I don't know what, and they reach into all the market data and they find, actually sometimes, some of these algorithms. And when they find them, they pull them out and they pin them to the wall like butterflies.

06:08
And they do what we've always done when confronted with huge amounts of data that we don't understand -- which is that they give them a name and a story. So this is one that they found: they called it the Knife. The Carnival. The Boston Shuffler. Twilight.

06:31
And the gag is that, of course, these aren't just running through the market. You can find these kinds of things wherever you look, once you learn how to look for them. You can find it here: this book about flies that you may have been looking at on Amazon. You may have noticed it when its price started at 1.7 million dollars. It's out of print -- still ... (Laughter)

06:54
If you had bought it at 1.7, it would have been a bargain. A few hours later, it had gone up to 23.6 million dollars, plus shipping and handling.

07:03
And the question is: Nobody was buying or selling anything; what was happening? And you see this behavior on Amazon as surely as you see it on Wall Street. And when you see this kind of behavior, what you see is the evidence of algorithms in conflict, algorithms locked in loops with each other, without any human oversight, without any adult supervision to say, "Actually, 1.7 million is plenty." (Laughter)

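A toy model of that kind of loop is easy to write down. In the most common account of the fly-book incident, one seller repriced slightly below its rival and the other repriced well above it; the multipliers and starting prices below are illustrative assumptions, not the actual Amazon sellers' rules.

```python
# Two hypothetical repricing bots that only ever look at each other's price.
UNDERCUT = 0.9983    # assumed: seller A prices just below seller B
MARKUP = 1.2706      # assumed: seller B prices well above seller A

price_a, price_b = 35.00, 40.00          # made-up starting prices
for day in range(1, 61):
    price_a = round(price_b * UNDERCUT, 2)
    price_b = round(price_a * MARKUP, 2)
    if day % 10 == 0:
        print(f"day {day:2d}:  A = ${price_a:>13,.2f}   B = ${price_b:>13,.2f}")
# With no human in the loop, a couple of months of daily repricing is enough
# to push both listings into the millions -- without a single sale.
```
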
07:30
And as with Amazon, so it is with Netflix. And so Netflix has gone through several different algorithms over the years. They started with Cinematch, and they've tried a bunch of others -- there's Dinosaur Planet; there's Gravity. They're using Pragmatic Chaos now. Pragmatic Chaos is, like all of Netflix algorithms, trying to do the same thing. It's trying to get a grasp on you, on the firmware inside the human skull, so that it can recommend what movie you might want to watch next -- which is a very, very difficult problem.

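The Netflix Prize models were blends of hundreds of predictors, and nothing here reflects their internals. But the basic collaborative-filtering idea (guess what you will want from how people with similar tastes rated things) fits in a short sketch; the ratings matrix below is made up.

```python
import numpy as np

# Toy user-by-movie ratings, 0 = not rated. Entirely made-up data.
R = np.array([
    [5, 4, 0, 1, 0],
    [4, 5, 0, 0, 1],
    [1, 0, 5, 4, 0],
    [0, 1, 4, 5, 3],
], dtype=float)
movies = ["Alpha", "Bravo", "Charlie", "Delta", "Echo"]

def cosine(u, v):
    norm = np.linalg.norm(u) * np.linalg.norm(v)
    return float(u @ v / norm) if norm else 0.0

def recommend(user: int) -> str:
    """Score each unseen movie by its similarity to the movies this user rated."""
    seen = R[user] > 0
    scores = {
        movies[j]: sum(cosine(R[:, j], R[:, k]) * R[user, k]
                       for k in np.where(seen)[0])
        for j in np.where(~seen)[0]
    }
    return max(scores, key=scores.get)

print("Suggestion for user 0:", recommend(0))
```
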
07:59
But the difficulty of the problem and the fact that we don't really quite have it down, it doesn't take away from the effects Pragmatic Chaos has. Pragmatic Chaos, like all Netflix algorithms, determines, in the end, 60 percent of what movies end up being rented. So one piece of code with one idea about you is responsible for 60 percent of those movies.

08:25
But what if you could rate those movies before they get made? Wouldn't that be handy? Well, a few data scientists from the U.K. are in Hollywood, and they have "story algorithms" -- a company called Epagogix. And you can run your script through there, and they can tell you, quantifiably, that that's a 30 million dollar movie or a 200 million dollar movie.

08:47
And the thing is, this isn't Google. This isn't information. These aren't financial stats; this is culture. And what you see here, or what you don't really see normally, is that these are the physics of culture.

09:01
And if these algorithms, like the algorithms on Wall Street, just crashed one day and went awry, how would we know? What would it look like?

09:12
And they're in your house. They're in your house. These are two algorithms competing for your living room. These are two different cleaning robots that have very different ideas about what clean means. And you can see it if you slow it down and attach lights to them, and they're sort of like secret architects in your bedroom. And the idea that architecture itself is somehow subject to algorithmic optimization is not far-fetched. It's super-real and it's happening around you.

09:40
You feel it most when you're in a sealed metal box, a new-style elevator; they're called destination-control elevators. These are the ones where you have to press what floor you're going to go to before you get in the elevator. And it uses what's called a bin-packing algorithm. So none of this mishegas of letting everybody go into whatever car they want. Everybody who wants to go to the 10th floor goes into car two, and everybody who wants to go to the third floor goes into car five.

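In the simplest reading, the dispatcher is solving a small packing problem: group the requested floors so that each car carries a full load to as few stops as possible. The sketch below is a deliberately crude, one-destination-per-car version with an assumed capacity; real destination-dispatch systems also weigh waiting time, travel time and car position.

```python
from collections import defaultdict

CAR_CAPACITY = 8   # assumed capacity per car

def assign_cars(requests):
    """Group passengers by the floor they keyed in downstairs,
    opening another car whenever one fills up."""
    assignments = defaultdict(list)   # car index -> list of destination floors
    filling = {}                      # floor -> car currently being filled
    next_car = 0
    for floor in requests:
        car = filling.get(floor)
        if car is None or len(assignments[car]) >= CAR_CAPACITY:
            car, next_car = next_car, next_car + 1
            filling[floor] = car
        assignments[car].append(floor)
    return assignments

lobby = [10, 3, 10, 7, 3, 10, 3, 7, 10, 3]    # one entry per waiting passenger
for car, floors in sorted(assign_cars(lobby).items()):
    print(f"car {car + 1}: {len(floors)} passengers -> floor {floors[0]}")
```
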
10:04
And the problem with that is that people freak out. People panic. And you see why. You see why. It's because the elevator is missing some important instrumentation, like the buttons. (Laughter) Like the things that people use. All it has is just the number that moves up or down and that red button that says, "Stop."

10:29
And this is what we're designing for. We're designing for this machine dialect. And how far can you take that? How far can you take it? You can take it really, really far. So let me take it back to Wall Street.

10:45
Because the algorithms of Wall Street are dependent on one quality above all else, which is speed. And they operate on milliseconds and microseconds. And just to give you a sense of what microseconds are, it takes you 500,000 microseconds just to click a mouse. But if you're a Wall Street algorithm and you're five microseconds behind, you're a loser.

11:07
So if you were an algorithm, you'd look for an architect like the one that I met in Frankfurt who was hollowing out a skyscraper -- throwing out all the furniture, all the infrastructure for human use, and just running steel on the floors to get ready for the stacks of servers to go in -- all so an algorithm could get close to the Internet.

11:28
And you think of the Internet as this kind of distributed system. And of course, it is, but it's distributed from places. In New York, this is where it's distributed from: the Carrier Hotel located on Hudson Street. And this is really where the wires come right up into the city. And the reality is that the further away you are from that, you're a few microseconds behind every time.

11:49
These guys down on Wall Street, Marco Polo and Cherokee Nation, they're eight microseconds behind all these guys going into the empty buildings being hollowed out up around the Carrier Hotel. And that's going to keep happening. We're going to keep hollowing them out, because you, inch for inch and pound for pound and dollar for dollar, none of you could squeeze revenue out of that space like the Boston Shuffler could.

12:20
But if you zoom out, if you zoom out, you would see an 825-mile trench between New York City and Chicago that's been built over the last few years by a company called Spread Networks. This is a fiber optic cable that was laid between those two cities to just be able to traffic one signal 37 times faster than you can click a mouse -- just for these algorithms, just for the Carnival and the Knife.

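The numbers roughly check out: light in glass fiber travels at about two-thirds of its vacuum speed, so an 825-mile run gives a round trip in the low tens of milliseconds, a few dozen times shorter than the half a second it takes to click a mouse. A back-of-the-envelope check (the two-thirds factor and the straight-line assumption are the approximations here):

```python
C_KM_PER_S = 299_792          # speed of light in vacuum
FIBER_FACTOR = 2 / 3          # assumed: signal speed in glass fiber
MILES, KM_PER_MILE = 825, 1.609344

one_way_s = (MILES * KM_PER_MILE) / (C_KM_PER_S * FIBER_FACTOR)
round_trip_us = 2 * one_way_s * 1e6          # about 13,000 microseconds
mouse_click_us = 500_000                     # figure quoted earlier in the talk

print(f"round trip : {round_trip_us:,.0f} microseconds")
print(f"mouse click: {mouse_click_us:,} microseconds "
      f"(~{mouse_click_us / round_trip_us:.0f}x slower)")
```
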
12:51
And when you think about this, that we're running through the United States with dynamite and rock saws so that an algorithm can close the deal three microseconds faster, all for a communications framework that no human will ever know, that's a kind of manifest destiny; and we'll always look for a new frontier.

13:15
Unfortunately, we have our work cut out for us. This is just theoretical. This is some mathematicians at MIT. And the truth is I don't really understand a lot of what they're talking about. It involves light cones and quantum entanglement, and I don't really understand any of that.

13:31
But I can read this map, and what this map says is that, if you're trying to make money on the markets where the red dots are, that's where people are, where the cities are, you're going to have to put the servers where the blue dots are to do that most effectively. And the thing that you might have noticed about those blue dots is that a lot of them are in the middle of the ocean.

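Setting the light cones and entanglement aside, the practical core of that map is geometric: if two exchanges trade correlated instruments, the latency-optimal place for a relay server sits roughly midway between them along the great circle, and for many city pairs that midpoint is open ocean. A minimal sketch of that calculation, with approximate coordinates chosen only for illustration:

```python
import math

def great_circle_midpoint(lat1, lon1, lat2, lon2):
    """Midpoint of the shortest path between two points on a sphere (degrees)."""
    p1, l1, p2, l2 = map(math.radians, (lat1, lon1, lat2, lon2))
    bx = math.cos(p2) * math.cos(l2 - l1)
    by = math.cos(p2) * math.sin(l2 - l1)
    lat_m = math.atan2(math.sin(p1) + math.sin(p2),
                       math.hypot(math.cos(p1) + bx, by))
    lon_m = l1 + math.atan2(by, math.cos(p1) + bx)
    return math.degrees(lat_m), math.degrees(lon_m)

new_york = (40.71, -74.01)    # approximate coordinates
london = (51.51, -0.13)
print(great_circle_midpoint(*new_york, *london))   # lands in the North Atlantic
```
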
13:51
So that's what we'll do: we'll build bubbles or something, or platforms. We'll actually part the water to pull money out of the air, because it's a bright future if you're an algorithm. (Laughter)

14:06
And it's not the money that's so interesting actually. It's what the money motivates, that we're actually terraforming the Earth itself with this kind of algorithmic efficiency.

14:17
And in that light, you go back and you look at Michael Najjar's photographs, and you realize that they're not metaphor, they're prophecy. They're prophecy for the kind of seismic, terrestrial effects of the math that we're making.

14:34
And the landscape was always made by this sort of weird, uneasy collaboration between nature and man. But now there's this third co-evolutionary force: algorithms -- the Boston Shuffler, the Carnival. And we will have to understand those as nature, and in a way, they are.

14:54
Thank you. (Applause)