How to be "Team Human" in the digital future | Douglas Rushkoff

117,366 views ・ 2019-01-14

TED



Translator: Carol Wang  Reviewer: Fandi Yi
00:13
I got invited to an exclusive resort
00:17
to deliver a talk about the digital future
00:19
to what I assumed would be a couple of hundred tech executives.
00:23
And I was there in the green room, waiting to go on,
00:26
and instead of bringing me to the stage, they brought five men into the green room
00:31
who sat around this little table with me.
00:33
They were tech billionaires.
00:35
And they started peppering me with these really binary questions,
00:40
like: Bitcoin or Ethereum?
00:43
Virtual reality or augmented reality?
00:45
I don't know if they were taking bets or what.
00:48
And as they got more comfortable with me,
00:51
they edged towards their real question of concern.
00:54
Alaska or New Zealand?
00:57
That's right.
00:59
These tech billionaires were asking a media theorist for advice
01:02
on where to put their doomsday bunkers.
01:04
We spent the rest of the hour on the single question:
01:07
"How do I maintain control of my security staff
01:11
after the event?"
01:13
By "the event" they mean the thermonuclear war
01:16
or climate catastrophe or social unrest that ends the world as we know it,
01:21
and more importantly, makes their money obsolete.
01:26
And I couldn't help but think:
01:28
these are the wealthiest, most powerful men in the world,
01:33
yet they see themselves as utterly powerless to influence the future.
01:37
The best they can do is hang on for the inevitable catastrophe
01:42
and then use their technology and money to get away from the rest of us.
01:47
And these are the winners of the digital economy.
01:50
(Laughter)
01:53
The digital renaissance
01:56
was about the unbridled potential
02:00
of the collective human imagination.
02:03
It spanned everything from chaos math and quantum physics
02:08
to fantasy role-playing and the Gaia hypothesis, right?
02:12
We believed that human beings connected could create any future we could imagine.
02:20
And then came the dot-com boom.
02:24
And the digital future became stock futures.
02:28
And we used all that energy of the digital age
02:31
to pump steroids into the already dying NASDAQ stock exchange.
02:35
The tech magazines told us a tsunami was coming.
02:39
And only the investors who hired the best scenario-planners and futurists
02:43
would be able to survive the wave.
02:47
And so the future changed from this thing we create together in the present
02:53
to something we bet on
02:54
in some kind of a zero-sum winner-takes-all competition.
03:00
And when things get that competitive about the future,
03:03
humans are no longer valued for our creativity.
03:06
No, now we're just valued for our data.
03:09
Because they can use the data to make predictions.
03:12
Creativity, if anything, creates noise.
03:14
That makes it harder to predict.
03:17
So we ended up with a digital landscape
03:19
that really repressed creativity, repressed novelty,
03:22
it repressed what makes us most human.
03:26
We ended up with social media.
03:28
Does social media really connect people in new, interesting ways?
03:31
No, social media is about using our data to predict our future behavior.
03:36
Or when necessary, to influence our future behavior
03:39
so that we act more in accordance with our statistical profiles.
03:45
The digital economy -- does it like people?
03:47
No, if you have a business plan, what are you supposed to do?
03:50
Get rid of all the people.
03:51
Human beings, they want health care, they want money, they want meaning.
03:56
You can't scale with people.
03:59
(Laughter)
04:00
Even our digital apps --
04:02
they don't help us form any rapport or solidarity.
04:05
I mean, where's the button on the ride-hailing app
04:07
for the drivers to talk to one another about their working conditions
04:11
or to unionize?
04:13
Even our videoconferencing tools,
04:15
they don't allow us to establish real rapport.
04:18
However good the resolution of the video,
04:21
you still can't see if somebody's irises are opening to really take you in.
04:25
All of the things that we've done to establish rapport
04:28
that we've developed over hundreds of thousands of years of evolution,
04:31
they don't work,
04:32
you can't see if someone's breath is syncing up with yours.
04:35
So the mirror neurons never fire, the oxytocin never goes through your body,
04:39
you never have that experience of bonding with the other human being.
04:43
And instead, you're left like,
04:44
"Well, they agreed with me, but did they really,
04:47
did they really get me?"
04:48
And we don't blame the technology for that lack of fidelity.
04:52
We blame the other person.
04:55
You know, even the technologies and the digital initiatives that we have
04:59
to promote humans,
05:01
are intensely anti-human at the core.
05:05
Think about the blockchain.
05:08
The blockchain is here to help us have a great humanized economy? No.
05:12
The blockchain does not engender trust between users,
05:14
the blockchain simply substitutes for trust in a new,
05:18
even less transparent way.
05:21
Or the code movement.
05:23
I mean, education is great, we love education,
05:25
and it's a wonderful idea
05:27
that we want kids to be able to get jobs in the digital future,
05:30
so we'll teach them code now.
05:32
But since when is education about getting jobs?
05:35
Education wasn't about getting jobs.
05:37
Education was compensation for a job well done.
05:41
The idea of public education
05:43
was for coal miners, who would work in the coal mines all day,
05:46
then they'd come home and they should have the dignity
05:49
to be able to read a novel and understand it.
05:51
Or the intelligence to be able to participate in democracy.
05:55
When we make it an extension of the job, what are we really doing?
05:58
We're just letting corporations really
06:01
externalize the cost of training their workers.
06:05
And the worst of all really is the humane technology movement.
06:09
I mean, I love these guys, the former guys who used to take
06:12
the algorithms from Las Vegas slot machines
06:15
and put them in our social media feed so that we get addicted.
06:18
Now they've seen the error of their ways
06:20
and they want to make technology more humane.
06:22
But when I hear the expression "humane technology,"
06:25
I think about cage-free chickens or something.
06:28
We're going to be as humane as possible to them,
06:30
until we take them to the slaughter.
06:33
So now they're going to let these technologies be as humane as possible,
06:36
as long as they extract enough data and extract enough money from us
06:39
to please their shareholders.
06:42
Meanwhile, the shareholders, for their part, they're just thinking,
06:45
"I need to earn enough money now, so I can insulate myself
06:48
from the world I'm creating by earning money in this way."
06:51
(Laughter)
06:54
No matter how many VR goggles they slap on their faces
06:58
and whatever fantasy world they go into,
07:00
they can't externalize the slavery and pollution that was caused
07:04
through the manufacture of the very device.
07:07
It reminds me of Thomas Jefferson's dumbwaiter.
07:10
Now, we like to think that he made the dumbwaiter
07:12
in order to spare his slaves all that labor of carrying the food
07:16
up to the dining room for the people to eat.
07:19
That's not what it was for, it wasn't for the slaves,
07:21
it was for Thomas Jefferson and his dinner guests,
07:24
so they didn't have to see the slave bringing the food up.
07:27
The food just arrived magically,
07:28
like it was coming out of a "Star Trek" replicator.
07:32
It's part of an ethos that says,
07:34
human beings are the problem and technology is the solution.
07:40
We can't think that way anymore.
07:42
We have to stop using technology to optimize human beings for the market
07:48
and start optimizing technology for the human future.
07:55
But that's a really hard argument to make these days,
07:57
because humans are not popular beings.
08:01
I talked about this in front of an environmentalist just the other day,
08:05
and she said, "Why are you defending humans?
08:07
Humans destroyed the planet. They deserve to go extinct."
08:10
(Laughter)
08:13
Even our popular media hates humans.
08:16
Watch television,
08:17
all the sci-fi shows are about how robots are better and nicer than people.
08:21
Even zombie shows -- what is every zombie show about?
08:24
Some person, looking at the horizon at some zombie going by,
08:27
and they zoom in on the person and you see the person's face,
08:30
and you know what they're thinking:
08:32
"What's really the difference between that zombie and me?
08:34
He walks, I walk.
08:36
He eats, I eat.
08:38
He kills, I kill."
08:42
But he's a zombie.
08:43
At least you're aware of it.
08:45
If we are actually having trouble distinguishing ourselves from zombies,
08:49
we have a pretty big problem going on.
08:51
(Laughter)
08:52
And don't even get me started on the transhumanists.
08:55
I was on a panel with a transhumanist, and he's going on about the singularity.
08:59
"Oh, the day is going to come really soon when computers are smarter than people.
09:03
And the only option for people at that point
09:05
is to pass the evolutionary torch to our successor
09:08
and fade into the background.
09:10
Maybe at best, upload your consciousness to a silicon chip.
09:13
And accept your extinction."
09:16
(Laughter)
09:18
And I said, "No, human beings are special.
09:21
We can embrace ambiguity, we understand paradox,
09:25
we're conscious, we're weird, we're quirky.
09:27
There should be a place for humans in the digital future."
09:31
And he said, "Oh, Rushkoff,
09:32
you're just saying that because you're a human."
09:34
(Laughter)
09:36
As if it's hubris.
09:39
OK, I'm on "Team Human."
09:43
That was the original insight of the digital age.
09:47
That being human is a team sport,
09:49
evolution's a collaborative act.
09:52
Even the trees in the forest,
09:53
they're not all in competition with each other,
09:55
they're connected with the vast network of roots and mushrooms
09:59
that let them communicate with one another and pass nutrients back and forth.
10:03
If human beings are the most evolved species,
10:05
it's because we have the most evolved ways of collaborating and communicating.
10:09
We have language.
10:11
We have technology.
10:14
It's funny, I used to be the guy who talked about the digital future
10:18
for people who hadn't yet experienced anything digital.
10:22
And now I feel like I'm the last guy
10:24
who remembers what life was like before digital technology.
10:28
It's not a matter of rejecting the digital or rejecting the technological.
10:32
It's a matter of retrieving the values that we're in danger of leaving behind
10:37
and then embedding them in the digital infrastructure for the future.
10:41
And that's not rocket science.
10:44
It's as simple as making a social network
10:46
that instead of teaching us to see people as adversaries,
10:49
it teaches us to see our adversaries as people.
10:54
It means creating an economy that doesn't favor a platform monopoly
10:58
that wants to extract all the value out of people and places,
11:01
but one that promotes the circulation of value through a community
11:05
and allows us to establish platform cooperatives
11:08
that distribute ownership as wide as possible.
11:12
It means building platforms
11:13
that don't repress our creativity and novelty in the name of prediction
11:18
but actually promote creativity and novelty,
11:21
so that we can come up with some of the solutions
11:23
to actually get ourselves out of the mess that we're in.
11:27
No, instead of trying to earn enough money to insulate ourselves
11:30
from the world we're creating,
11:32
why don't we spend that time and energy making the world a place
11:35
that we don't feel the need to escape from?
11:38
There is no escape, there is only one thing going on here.
11:42
Please, don't leave.
11:45
Join us.
11:47
We may not be perfect,
11:49
but whatever happens, at least you won't be alone.
11:52
Join "Team Human."
11:55
Find the others.
11:57
Together, let's make the future that we always wanted.
12:01
Oh, and those tech billionaires who wanted to know
12:04
how to maintain control of their security force after the apocalypse,
12:07
you know what I told them?
12:09
"Start treating those people with love and respect right now.
12:13
Maybe you won't have an apocalypse to worry about."
12:16
Thank you.
12:18
(Applause)