How to be "Team Human" in the digital future | Douglas Rushkoff

120,262 views ・ 2019-01-14

TED



Translator: Lilian Chiu · Reviewer: 易帆 余
00:13
I got invited to an exclusive resort to deliver a talk about the digital future to what I assumed would be a couple of hundred tech executives. And I was there in the green room, waiting to go on, and instead of bringing me to the stage, they brought five men into the green room who sat around this little table with me. They were tech billionaires. And they started peppering me with these really binary questions, like: Bitcoin or Ethereum? Virtual reality or augmented reality? I don't know if they were taking bets or what. And as they got more comfortable with me, they edged towards their real question of concern. Alaska or New Zealand?

00:57
That's right. These tech billionaires were asking a media theorist for advice on where to put their doomsday bunkers. We spent the rest of the hour on the single question: "How do I maintain control of my security staff after the event?" By "the event" they mean the thermonuclear war or climate catastrophe or social unrest that ends the world as we know it, and more importantly, makes their money obsolete.

01:26
And I couldn't help but think: these are the wealthiest, most powerful men in the world, yet they see themselves as utterly powerless to influence the future. The best they can do is hang on for the inevitable catastrophe and then use their technology and money to get away from the rest of us. And these are the winners of the digital economy.

(Laughter)

01:53
The digital renaissance was about the unbridled potential of the collective human imagination. It spanned everything from chaos math and quantum physics to fantasy role-playing and the Gaia hypothesis, right? We believed that human beings connected could create any future we could imagine.

02:20
And then came the dot-com boom. And the digital future became stock futures. And we used all that energy of the digital age to pump steroids into the already dying NASDAQ stock exchange. The tech magazines told us a tsunami was coming. And only the investors who hired the best scenario-planners and futurists would be able to survive the wave.

02:47
And so the future changed from this thing we create together in the present to something we bet on in some kind of a zero-sum winner-takes-all competition.

03:00
And when things get that competitive about the future, humans are no longer valued for our creativity. No, now we're just valued for our data. Because they can use the data to make predictions. Creativity, if anything, that creates noise. That makes it harder to predict.

03:17
So we ended up with a digital landscape that really repressed creativity, repressed novelty, it repressed what makes us most human.

03:26
We ended up with social media. Does social media really connect people in new, interesting ways? No, social media is about using our data to predict our future behavior. Or when necessary, to influence our future behavior so that we act more in accordance with our statistical profiles.

03:45
The digital economy -- does it like people? No, if you have a business plan, what are you supposed to do? Get rid of all the people. Human beings, they want health care, they want money, they want meaning. You can't scale with people.

(Laughter)

04:00
Even our digital apps -- they don't help us form any rapport or solidarity. I mean, where's the button on the ride-hailing app for the drivers to talk to one another about their working conditions or to unionize?

04:13
Even our videoconferencing tools, they don't allow us to establish real rapport. However good the resolution of the video, you still can't see if somebody's irises are opening to really take you in. All of the things that we've done to establish rapport that we've developed over hundreds of thousands of years of evolution, they don't work, you can't see if someone's breath is syncing up with yours. So the mirror neurons never fire, the oxytocin never goes through your body, you never have that experience of bonding with the other human being. And instead, you're left like, "Well, they agreed with me, but did they really, did they really get me?" And we don't blame the technology for that lack of fidelity. We blame the other person.

04:55
You know, even the technologies and the digital initiatives that we have to promote humans are intensely anti-human at the core.

05:05
Think about the blockchain. The blockchain is here to help us have a great humanized economy? No. The blockchain does not engender trust between users, the blockchain simply substitutes for trust in a new, even less transparent way.

05:21
Or the code movement. I mean, education is great, we love education, and it's a wonderful idea that we want kids to be able to get jobs in the digital future, so we'll teach them code now. But since when is education about getting jobs? Education wasn't about getting jobs. Education was compensation for a job well done. The idea of public education was for coal miners, who would work in the coal mines all day, then they'd come home and they should have the dignity to be able to read a novel and understand it. Or the intelligence to be able to participate in democracy. When we make it an extension of the job, what are we really doing? We're just letting corporations really externalize the cost of training their workers.

06:05
And the worst of all really is the humane technology movement. I mean, I love these guys, the former guys who used to take the algorithms from Las Vegas slot machines and put them in our social media feed so that we get addicted. Now they've seen the error of their ways and they want to make technology more humane. But when I hear the expression "humane technology," I think about cage-free chickens or something. We're going to be as humane as possible to them, until we take them to the slaughter. So now they're going to let these technologies be as humane as possible, as long as they extract enough data and extract enough money from us to please their shareholders.

06:42
Meanwhile, the shareholders, for their part, they're just thinking, "I need to earn enough money now, so I can insulate myself from the world I'm creating by earning money in this way."

(Laughter)

06:54
No matter how many VR goggles they slap on their faces and whatever fantasy world they go into, they can't externalize the slavery and pollution that was caused through the manufacture of the very device.

07:07
It reminds me of Thomas Jefferson's dumbwaiter. Now, we like to think that he made the dumbwaiter in order to spare his slaves all that labor of carrying the food up to the dining room for the people to eat. That's not what it was for, it wasn't for the slaves, it was for Thomas Jefferson and his dinner guests, so they didn't have to see the slave bringing the food up. The food just arrived magically, like it was coming out of a "Star Trek" replicator. It's part of an ethos that says, human beings are the problem and technology is the solution.

07:40
We can't think that way anymore. We have to stop using technology to optimize human beings for the market and start optimizing technology for the human future.

07:55
But that's a really hard argument to make these days, because humans are not popular beings. I talked about this in front of an environmentalist just the other day, and she said, "Why are you defending humans? Humans destroyed the planet. They deserve to go extinct."

(Laughter)

08:13
Even our popular media hates humans. Watch television, all the sci-fi shows are about how robots are better and nicer than people. Even zombie shows -- what is every zombie show about? Some person, looking at the horizon at some zombie going by, and they zoom in on the person and you see the person's face, and you know what they're thinking: "What's really the difference between that zombie and me? He walks, I walk. He eats, I eat. He kills, I kill." But he's a zombie. At least you're aware of it. If we are actually having trouble distinguishing ourselves from zombies, we have a pretty big problem going on.

(Laughter)

08:52
And don't even get me started on the transhumanists. I was on a panel with a transhumanist, and he's going on about the singularity. "Oh, the day is going to come really soon when computers are smarter than people. And the only option for people at that point is to pass the evolutionary torch to our successor and fade into the background. Maybe at best, upload your consciousness to a silicon chip. And accept your extinction."

(Laughter)

09:18
And I said, "No, human beings are special. We can embrace ambiguity, we understand paradox, we're conscious, we're weird, we're quirky. There should be a place for humans in the digital future." And he said, "Oh, Rushkoff, you're just saying that because you're a human."

(Laughter)

09:36
As if it's hubris.

09:39
OK, I'm on "Team Human." That was the original insight of the digital age. That being human is a team sport, evolution's a collaborative act. Even the trees in the forest, they're not all in competition with each other, they're connected with the vast network of roots and mushrooms that let them communicate with one another and pass nutrients back and forth. If human beings are the most evolved species, it's because we have the most evolved ways of collaborating and communicating. We have language. We have technology.

10:14
It's funny, I used to be the guy who talked about the digital future for people who hadn't yet experienced anything digital. And now I feel like I'm the last guy who remembers what life was like before digital technology.

10:28
It's not a matter of rejecting the digital or rejecting the technological. It's a matter of retrieving the values that we're in danger of leaving behind and then embedding them in the digital infrastructure for the future. And that's not rocket science.

10:44
It's as simple as making a social network that instead of teaching us to see people as adversaries, it teaches us to see our adversaries as people. It means creating an economy that doesn't favor a platform monopoly that wants to extract all the value out of people and places, but one that promotes the circulation of value through a community and allows us to establish platform cooperatives that distribute ownership as wide as possible. It means building platforms that don't repress our creativity and novelty in the name of prediction but actually promote creativity and novelty, so that we can come up with some of the solutions to actually get ourselves out of the mess that we're in.

11:27
No, instead of trying to earn enough money to insulate ourselves from the world we're creating, why don't we spend that time and energy making the world a place that we don't feel the need to escape from. There is no escape, there is only one thing going on here.

11:42
Please, don't leave. Join us. We may not be perfect, but whatever happens, at least you won't be alone. Join "Team Human." Find the others. Together, let's make the future that we always wanted.

12:01
Oh, and those tech billionaires who wanted to know how to maintain control of their security force after the apocalypse, you know what I told them? "Start treating those people with love and respect right now. Maybe you won't have an apocalypse to worry about."

12:16
Thank you.

(Applause)