How we need to remake the internet | Jaron Lanier

430,004 views ・ 2018-05-03

TED



Translator: Lilian Chiu  Reviewer: 易帆 余
00:12
Back in the 1980s, actually, I gave my first talk at TED, and I brought some of the very, very first public demonstrations of virtual reality ever to the TED stage.

00:26
And at that time, we knew that we were facing a knife-edge future where the technology we needed, the technology we loved, could also be our undoing.

00:43
We knew that if we thought of our technology as a means to ever more power, if it was just a power trip, we'd eventually destroy ourselves. That's what happens when you're on a power trip and nothing else.

00:59
So the idealism of digital culture back then was all about starting with that recognition of the possible darkness and trying to imagine a way to transcend it with beauty and creativity.

01:19
I always used to end my early TED Talks with a rather horrifying line, which is, "We have a challenge. We have to create a culture around technology that is so beautiful, so meaningful, so deep, so endlessly creative, so filled with infinite potential that it draws us away from committing mass suicide."

01:48
So we talked about extinction as being one and the same as the need to create an alluring, infinitely creative future. And I still believe that that alternative of creativity as an alternative to death is very real and true, maybe the most true thing there is.

02:11
In the case of virtual reality -- well, the way I used to talk about it is that it would be something like what happened when people discovered language. With language came new adventures, new depth, new meaning, new ways to connect, new ways to coordinate, new ways to imagine, new ways to raise children, and I imagined, with virtual reality, we'd have this new thing that would be like a conversation but also like waking-state intentional dreaming. We called it post-symbolic communication, because it would be like just directly making the thing you experienced instead of indirectly making symbols to refer to things.

02:53
It was a beautiful vision, and it's one I still believe in, and yet, haunting that beautiful vision was the dark side of how it could also turn out.

03:04
And I suppose I could mention one of the very earliest computer scientists, whose name was Norbert Wiener, and he wrote a book back in the '50s, from before I was even born, called "The Human Use of Human Beings." And in the book, he described the potential to create a computer system that would be gathering data from people and providing feedback to those people in real time in order to put them kind of partially, statistically, in a Skinner box, in a behaviorist system, and he has this amazing line where he says, one could imagine, as a thought experiment -- and I'm paraphrasing, this isn't a quote -- one could imagine a global computer system where everybody has devices on them all the time, and the devices are giving them feedback based on what they did, and the whole population is subject to a degree of behavior modification. And such a society would be insane, could not survive, could not face its problems. And then he says, but this is only a thought experiment, and such a future is technologically infeasible.

04:18
(Laughter)

04:19
And yet, of course, it's what we have created, and it's what we must undo if we are to survive.

04:27
So --

04:28
(Applause)

04:32
I believe that we made a very particular mistake, and it happened early on, and by understanding the mistake we made, we can undo it. It happened in the '90s, going into the turn of the century, and here's what happened.

04:53
Early digital culture, and indeed, digital culture to this day, had a sense of, I would say, lefty, socialist mission about it: that unlike other things that have been done, like the invention of books, everything on the internet must be purely public, must be available for free, because if even one person cannot afford it, then that would create this terrible inequity.

05:21
Now of course, there's other ways to deal with that. If books cost money, you can have public libraries. And so forth. But we were thinking, no, no, no, this is an exception. This must be pure public commons, that's what we want. And so that spirit lives on. You can experience it in designs like the Wikipedia, for instance, and many others.

05:43
But at the same time, we also believed, with equal fervor, in this other thing that was completely incompatible, which is we loved our tech entrepreneurs. We loved Steve Jobs; we loved this Nietzschean myth of the techie who could dent the universe. Right? And that mythical power still has a hold on us, as well.

06:10
So you have these two different passions, for making everything free and for the almost supernatural power of the tech entrepreneur. How do you celebrate entrepreneurship when everything's free? Well, there was only one solution back then, which was the advertising model. And so therefore, Google was born free, with ads; Facebook was born free, with ads.

06:39
Now in the beginning, it was cute, like with the very earliest Google.

06:45
(Laughter)

06:46
The ads really were kind of ads. They would be, like, your local dentist or something. But there's a thing called Moore's law that makes the computers more and more efficient and cheaper. Their algorithms get better. We actually have universities where people study them, and they get better and better. And the customers and other entities who use these systems just got more and more experienced and got cleverer and cleverer. And what started out as advertising really can't be called advertising anymore. It turned into behavior modification, just as Norbert Wiener had worried it might.

07:24
And so I can't call these things social networks anymore. I call them behavior modification empires.

07:32
(Applause)

07:34
And I refuse to vilify the individuals. I have dear friends at these companies; I sold a company to Google, even though I think it's one of these empires. I don't think this is a matter of bad people who've done a bad thing. I think this is a matter of a globally tragic, astoundingly ridiculous mistake, rather than a wave of evil.

08:04
Let me give you just another layer of detail into how this particular mistake functions. So with behaviorism, you give the creature, whether it's a rat or a dog or a person, little treats and sometimes little punishments as feedback to what they do. So if you have an animal in a cage, it might be candy and electric shocks. But if you have a smartphone, it's not those things; it's symbolic punishment and reward. Pavlov, one of the early behaviorists, demonstrated the famous principle. You could train a dog to salivate just with the bell, just with the symbol.

08:49
So on social networks, social punishment and social reward function as the punishment and reward. And we all know the feeling of these things. You get this little thrill -- "Somebody liked my stuff and it's being repeated." Or the punishment: "Oh my God, they don't like me, maybe somebody else is more popular, oh my God." So you have those two very common feelings, and they're doled out in such a way that you get caught in this loop. As has been publicly acknowledged by many of the founders of the system, everybody knew this is what was going on.
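To make the loop concrete, here is a minimal, purely illustrative sketch of a variable-ratio reinforcement schedule, the pattern behaviorists found most habit-forming. Nothing in it comes from any real platform; the probabilities and the `check_feed` function are invented for illustration.

```python
import random

# Toy model of a variable-ratio reinforcement schedule. All numbers are
# invented for illustration; no real platform's parameters are implied.
REWARD_PROBABILITY = 0.3      # chance a check yields a "like" (symbolic treat)
PUNISHMENT_PROBABILITY = 0.1  # chance it yields a snub (symbolic shock)

def check_feed(rng: random.Random) -> str:
    """One pull of the lever: a symbolic reward, a punishment, or nothing."""
    roll = rng.random()
    if roll < REWARD_PROBABILITY:
        return "reward"      # "Somebody liked my stuff!"
    if roll < REWARD_PROBABILITY + PUNISHMENT_PROBABILITY:
        return "punishment"  # "Oh my God, they don't like me."
    return "nothing"

def simulate(checks: int, seed: int = 0) -> dict:
    rng = random.Random(seed)
    tally = {"reward": 0, "punishment": 0, "nothing": 0}
    for _ in range(checks):
        tally[check_feed(rng)] += 1
    return tally

print(simulate(1000))
```

A fixed schedule would be easy to habituate to; it is the unpredictability of when the next reward lands that makes the loop sticky.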
09:19
But here's the thing: traditionally, in the academic study of the methods of behaviorism, there have been comparisons of positive and negative stimuli. In this setting, a commercial setting, there's a new kind of difference that has kind of evaded the academic world for a while, and that difference is that whether or not positive stimuli are more effective than negative ones in different circumstances, the negative ones are cheaper. They're the bargain stimuli. So what I mean by that is it's much easier to lose trust than to build trust. It takes a long time to build love. It takes a short time to ruin love.

10:05
Now the customers of these behavior modification empires are on a very fast loop. They're almost like high-frequency traders. They're getting feedback from their spends, or whatever their activities are if they're not spending, and they see what's working, and then they do more of that. And so they're getting the quick feedback, which means they're responding more to the negative emotions, because those are the ones that rise faster, right?
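That asymmetry is easy to demonstrate. Below is a toy simulation assuming only what the talk asserts, that reactions to negative material arrive faster; the variant names, the delay numbers, and the short-window scorer are all hypothetical. An optimizer that scores variants on a short measurement window systematically favors the one whose feedback lands inside the window.

```python
import random

# Toy A/B test: both variants earn the same long-run engagement, but
# reactions to the "negative" one arrive sooner. A scorer that only counts
# reactions landing inside a short window rates "negative" higher almost
# every time. All numbers are invented for illustration.
WINDOW = 3.0                                      # scoring window (arbitrary units)
MEAN_DELAY = {"positive": 6.0, "negative": 1.0}   # negative feedback is faster
ENGAGEMENT_RATE = 0.5                             # identical for both variants

def windowed_score(variant: str, impressions: int, rng: random.Random) -> int:
    """Count only the reactions that arrive before the window closes."""
    hits = 0
    for _ in range(impressions):
        if rng.random() < ENGAGEMENT_RATE:
            delay = rng.expovariate(1.0 / MEAN_DELAY[variant])
            if delay <= WINDOW:
                hits += 1
    return hits

rng = random.Random(0)
wins = {"positive": 0, "negative": 0}
for _ in range(100):
    scores = {v: windowed_score(v, 200, rng) for v in ("positive", "negative")}
    wins[max(scores, key=scores.get)] += 1
print(wins)  # the negative variant wins nearly every comparison
```

Nothing in the sketch requires malice: equal engagement rates, an honest counter, and a fast loop are enough to tilt the system toward the negative variant.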
10:30
And so therefore, even well-intentioned players who think all they're doing is advertising toothpaste end up advancing the cause of the negative people, the negative emotions, the cranks, the paranoids, the cynics, the nihilists. Those are the ones who get amplified by the system. And you can't pay one of these companies to make the world suddenly nice and improve democracy nearly as easily as you can pay to ruin those things.

11:01
And so this is the dilemma we've gotten ourselves into. The alternative is to turn back the clock, with great difficulty, and remake that decision. Remaking it would mean two things. It would mean first that many people, those who could afford to, would actually pay for these things. You'd pay for search, you'd pay for social networking. How would you pay? Maybe with a subscription fee, maybe with micro-payments as you use them. There's a lot of options.
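One of those options, metered micro-payments, is simple to picture mechanically. Here is a hypothetical sketch, with invented prices and service names, of an account that charges a fraction of a cent per use, so the user rather than an advertiser is the paying customer; the talk proposes the payment model, not this design.

```python
from decimal import Decimal

# Hypothetical micro-payment meter. Prices and service names are invented
# for illustration only.
PRICE_PER_USE = {
    "search_query": Decimal("0.002"),   # a fifth of a cent per search
    "feed_refresh": Decimal("0.001"),
}

class MeteredAccount:
    def __init__(self, balance: Decimal):
        self.balance = balance
        self.ledger: list[tuple[str, Decimal]] = []

    def use(self, service: str) -> None:
        """Charge one use of a service against the prepaid balance."""
        price = PRICE_PER_USE[service]
        if price > self.balance:
            raise RuntimeError("top up required")
        self.balance -= price
        self.ledger.append((service, price))

account = MeteredAccount(Decimal("5.00"))   # $5 of prepaid credit
for _ in range(30):
    account.use("search_query")
account.use("feed_refresh")
spent = sum(price for _, price in account.ledger)
print(f"spent ${spent}, balance ${account.balance}")
```

With the invented prices above, even a heavy day of searching costs only a few cents, which is the scale at which the subscription-versus-micro-payment choice would operate.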
11:36
If some of you are recoiling, and you're thinking, "Oh my God, I would never pay for these things. How could you ever get anyone to pay?" I want to remind you of something that just happened. Around this same time that companies like Google and Facebook were formulating their free idea, a lot of cyber culture also believed that in the future, televisions and movies would be created in the same way, kind of like the Wikipedia. But then, companies like Netflix, Amazon, HBO, said, "Actually, you know, subscribe. We'll give you great TV." And it worked!

12:14
We now are in this period called "peak TV," right? So sometimes when you pay for stuff, things get better. We can imagine a hypothetical --

12:25
(Applause)

12:29
We can imagine a hypothetical world of "peak social media." What would that be like? It would mean when you get on, you can get really useful, authoritative medical advice instead of cranks. It could mean when you want to get factual information, there's not a bunch of weird, paranoid conspiracy theories. We can imagine this wonderful other possibility. Ah. I dream of it. I believe it's possible. I'm certain it's possible.

12:58
And I'm certain that the companies, the Googles and the Facebooks, would actually do better in this world. I don't believe we need to punish Silicon Valley. We just need to remake the decision.

13:12
Of the big tech companies, it's really only two that depend on behavior modification and spying as their business plan. It's Google and Facebook.

13:23
(Laughter)

13:24
And I love you guys. Really, I do. Like, the people are fantastic.

13:30
I want to point out, if I may, if you look at Google, they can propagate cost centers endlessly with all of these companies, but they cannot propagate profit centers. They cannot diversify, because they're hooked. They're hooked on this model, just like their own users. They're in the same trap as their users, and you can't run a big corporation that way. So this is ultimately totally in the benefit of the shareholders and other stakeholders of these companies. It's a win-win solution. It'll just take some time to figure it out. A lot of details to work out, totally doable.

14:07
(Laughter)

14:10
I don't believe our species can survive unless we fix this. We cannot have a society in which, if two people wish to communicate, the only way that can happen is if it's financed by a third person who wishes to manipulate them.

14:25
(Applause)

14:35
(Applause ends)

14:36
In the meantime, if the companies won't change, delete your accounts, OK?

14:41
(Laughter)

14:42
(Applause)

14:43
That's enough for now. Thank you so much.

14:46
(Applause)