Why fascism is so tempting -- and how your data could power it | Yuval Noah Harari

760,125 views · 2018-06-08

TED


Translator: Lilian Chiu · Reviewer: Yanyan Hong
00:12
Hello, everyone. It's a bit funny, because I did write that humans will become digital, but I didn't think it would happen so fast, and that it would happen to me. But here I am, as a digital avatar, and here you are, so let's start.

00:32
And let's start with a question. How many fascists are there in the audience today?

(Laughter)

00:39
Well, it's a bit difficult to say, because we've forgotten what fascism is. People now use the term "fascist" as a kind of general-purpose abuse. Or they confuse fascism with nationalism. So let's take a few minutes to clarify what fascism actually is, and how it is different from nationalism.
01:05
The milder forms of nationalism have been among the most benevolent of human creations. Nations are communities of millions of strangers who don't really know each other. For example, I don't know the eight million people who share my Israeli citizenship. But thanks to nationalism, we can all care about one another and cooperate effectively. This is very good.

01:34
Some people, like John Lennon, imagine that without nationalism, the world would be a peaceful paradise. But far more likely, without nationalism, we would have been living in tribal chaos. If you look today at the most prosperous and peaceful countries in the world, countries like Sweden and Switzerland and Japan, you will see that they have a very strong sense of nationalism. In contrast, countries that lack a strong sense of nationalism, like Congo and Somalia and Afghanistan, tend to be violent and poor.
02:16
So what is fascism, and how is it different from nationalism? Well, nationalism tells me that my nation is unique, and that I have special obligations towards my nation. Fascism, in contrast, tells me that my nation is supreme, and that I have exclusive obligations towards it. I don't need to care about anybody or anything other than my nation.

02:48
Usually, of course, people have many identities and loyalties to different groups. For example, I can be a good patriot, loyal to my country, and at the same time, be loyal to my family, my neighborhood, my profession, humankind as a whole, truth and beauty.
03:09
Of course, when I have different identities and loyalties, it sometimes creates conflicts and complications. But, well, who ever told you that life was easy? Life is complicated. Deal with it.

03:26
Fascism is what happens when people try to ignore the complications and to make life too easy for themselves. Fascism denies all identities except the national identity and insists that I have obligations only towards my nation. If my nation demands that I sacrifice my family, then I will sacrifice my family. If the nation demands that I kill millions of people, then I will kill millions of people. And if my nation demands that I betray truth and beauty, then I should betray truth and beauty.
04:11
For example, how does a fascist evaluate art? How does a fascist decide whether a movie is a good movie or a bad movie? Well, it's very, very, very simple. There is really just one yardstick: if the movie serves the interests of the nation, it's a good movie; if the movie doesn't serve the interests of the nation, it's a bad movie. That's it.

04:40
Similarly, how does a fascist decide what to teach kids in school? Again, it's very simple. There is just one yardstick: you teach the kids whatever serves the interests of the nation. The truth doesn't matter at all.
05:00
Now, the horrors of the Second World War and of the Holocaust remind us of the terrible consequences of this way of thinking. But usually, when we talk about the ills of fascism, we do so in an ineffective way, because we tend to depict fascism as a hideous monster, without really explaining what was so seductive about it.

05:27
It's a bit like these Hollywood movies that depict the bad guys -- Voldemort or Sauron or Darth Vader -- as ugly and mean and cruel. They're cruel even to their own supporters. When I see these movies, I never understand -- why would anybody be tempted to follow a disgusting creep like Voldemort?

05:52
The problem with evil is that in real life, evil doesn't necessarily look ugly. It can look very beautiful. This is something that Christianity knew very well, which is why in Christian art, as opposed to Hollywood, Satan is usually depicted as a gorgeous hunk. This is why it's so difficult to resist the temptations of Satan, and why it is also difficult to resist the temptations of fascism.
06:22
Fascism makes people see themselves as belonging to the most beautiful and most important thing in the world -- the nation. And then people think, "Well, they taught us that fascism is ugly. But when I look in the mirror, I see something very beautiful, so I can't be a fascist, right?"

06:43
Wrong. That's the problem with fascism. When you look in the fascist mirror, you see yourself as far more beautiful than you really are. In the 1930s, when Germans looked in the fascist mirror, they saw Germany as the most beautiful thing in the world. If today, Russians look in the fascist mirror, they will see Russia as the most beautiful thing in the world. And if Israelis look in the fascist mirror, they will see Israel as the most beautiful thing in the world.
07:18
This does not mean that we are now facing a rerun of the 1930s. Fascism and dictatorships might come back, but they will come back in a new form, a form which is much more relevant to the new technological realities of the 21st century.

07:38
In ancient times, land was the most important asset in the world. Politics, therefore, was the struggle to control land. And dictatorship meant that all the land was owned by a single ruler or by a small oligarchy.

07:57
And in the modern age, machines became more important than land. Politics became the struggle to control the machines. And dictatorship meant that too many of the machines became concentrated in the hands of the government or of a small elite.
08:17
Now data is replacing both land and machines as the most important asset. Politics becomes the struggle to control the flows of data. And dictatorship now means that too much data is being concentrated in the hands of the government or of a small elite.

08:40
The greatest danger that now faces liberal democracy is that the revolution in information technology will make dictatorships more efficient than democracies. In the 20th century, democracy and capitalism defeated fascism and communism because democracy was better at processing data and making decisions. Given 20th-century technology, it was simply inefficient to try and concentrate too much data and too much power in one place.
09:19
But it is not a law of nature that centralized data processing is always less efficient than distributed data processing. With the rise of artificial intelligence and machine learning, it might become feasible to process enormous amounts of information very efficiently in one place, to take all the decisions in one place, and then centralized data processing will be more efficient than distributed data processing. And then the main handicap of authoritarian regimes in the 20th century -- their attempt to concentrate all the information in one place -- will become their greatest advantage.
10:10
Another technological danger that threatens the future of democracy is the merger of information technology with biotechnology, which might result in the creation of algorithms that know me better than I know myself. And once you have such algorithms, an external system, like the government, cannot just predict my decisions, it can also manipulate my feelings, my emotions. A dictator may not be able to provide me with good health care, but he will be able to make me love him and to make me hate the opposition.

10:55
Democracy will find it difficult to survive such a development because, in the end, democracy is not based on human rationality; it's based on human feelings. During elections and referendums, you're not being asked, "What do you think?" You're actually being asked, "How do you feel?" And if somebody can manipulate your emotions effectively, democracy will become an emotional puppet show.
11:30
So what can we do to prevent the return of fascism and the rise of new dictatorships? The number one question that we face is: Who controls the data? If you are an engineer, then find ways to prevent too much data from being concentrated in too few hands. And find ways to make sure that distributed data processing is at least as efficient as centralized data processing. This will be the best safeguard for democracy.

12:07
As for the rest of us who are not engineers, the number one question facing us is how not to allow ourselves to be manipulated by those who control the data.
12:23
The enemies of liberal democracy have a method. They hack our feelings. Not our emails, not our bank accounts -- they hack our feelings of fear and hate and vanity, and then use these feelings to polarize and destroy democracy from within.

12:44
This is actually a method that Silicon Valley pioneered in order to sell us products. But now, the enemies of democracy are using this very method to sell us fear and hate and vanity. They cannot create these feelings out of nothing. So they get to know our own preexisting weaknesses. And then use them against us.

13:13
And it is therefore the responsibility of all of us to get to know our weaknesses and make sure that they do not become a weapon in the hands of the enemies of democracy.
13:27
Getting to know our own weaknesses will also help us to avoid the trap of the fascist mirror. As we explained earlier, fascism exploits our vanity. It makes us see ourselves as far more beautiful than we really are. This is the seduction. But if you really know yourself, you will not fall for this kind of flattery.

13:54
If somebody puts a mirror in front of your eyes that hides all your ugly bits and makes you see yourself as far more beautiful and far more important than you really are, just break that mirror.

14:13
Thank you.

(Applause)
14:22
Chris Anderson: Yuval, thank you. Goodness me. It's so nice to see you again. So, if I understand you right, you're alerting us to two big dangers here. One is the possible resurgence of a seductive form of fascism, but close to that, dictatorships that may not exactly be fascistic, but control all the data. I wonder if there's a third concern that some people here have already expressed, which is where, not governments, but big corporations control all our data. What do you call that, and how worried should we be about that?
14:56
Yuval Noah Harari: Well, in the end, there isn't such a big difference between the corporations and the governments, because, as I said, the question is: Who controls the data? This is the real government. Whether you call it a corporation or a government -- if it's a corporation and it really controls the data, this is our real government. So the difference is more apparent than real.
15:18
CA: But somehow, at least with corporations, you can imagine market mechanisms where they can be taken down. I mean, if consumers just decide that the company is no longer operating in their interest, it does open the door to another market. It seems easier to imagine that than, say, citizens rising up and taking down a government that is in control of everything.

15:37
YNH: Well, we are not there yet, but again, if a corporation really knows you better than you know yourself -- at least that it can manipulate your own deepest emotions and desires, and you won't even realize -- you will think this is your authentic self. So in theory, yes, in theory, you can rise against a corporation, just as, in theory, you can rise against a dictatorship. But in practice, it is extremely difficult.
16:07
CA: So in "Homo Deus," you argue that this would be the century when humans kind of became gods, either through development of artificial intelligence or through genetic engineering. Has this prospect of political-system shift and collapse impacted your view on that possibility?

16:29
YNH: Well, I think it makes it even more likely, and more likely that it will happen faster, because in times of crisis, people are willing to take risks that they wouldn't otherwise take. And people are willing to try all kinds of high-risk, high-gain technologies. So these kinds of crises might serve the same function as the two world wars in the 20th century. The two world wars greatly accelerated the development of new and dangerous technologies. And the same thing might happen in the 21st century. I mean, you need to be a little crazy to run too fast, let's say, with genetic engineering. But now you have more and more crazy people in charge of different countries in the world, so the chances are getting higher, not lower.
17:23
CA: So, putting it all together, Yuval, you've got this unique vision. Roll the clock forward 30 years. What's your guess -- does humanity just somehow scrape through, look back and say, "Wow, that was a close thing. We did it!" Or not?

17:36
YNH: So far, we've managed to overcome all the previous crises. And especially if you look at liberal democracy and you think things are bad now, just remember how much worse things looked in 1938 or in 1968. So this is really nothing, this is just a small crisis. But you can never know, because, as a historian, I know that you should never underestimate human stupidity.

(Laughter) (Applause)

18:06
It is one of the most powerful forces that shape history.

18:11
CA: Yuval, it's been an absolute delight to have you with us. Thank you for making the virtual trip. Have a great evening there in Tel Aviv. Yuval Harari!

18:19
YNH: Thank you very much.

(Applause)