How technology changes our sense of right and wrong | Juan Enriquez

114,604 views ・ 2021-02-24

TED



Translator: — Reviewer: Jiasi Hao
00:13
In an era of extreme polarization, it's really dangerous to talk about right and wrong. You can be targeted, judged for something you said 10 years ago, 10 months ago, 10 hours ago, 10 seconds ago. And that means that those who think you're wrong may burn you at the stake, or those who are on your side that think you're not sufficiently orthodox may try and cancel you.
00:37
As you're thinking about right and wrong, I want you to consider three ideas. What if right and wrong is something that changes over time? What if right and wrong is something that can change because of technology? What if technology is moving exponentially?
00:53
So as you're thinking about this concept, remember human sacrifice used to be normal and natural. It was a way of appeasing the gods. Otherwise the rain wouldn't come, the sun wouldn't shine. Public executions. They were common, normal, legal. You used to take your kids to watch beheadings in the streets of Paris.
01:12
One of the greatest wrongs, slavery, indentured servitude, that was something that was practiced for millennia. It was practiced across the Incas, the Mayas, the Chinese, the Indians in North and South America.
01:29
And as you're thinking about this, one question is: why did something so wrong last for so long? And a second question is: why did it go away? And why did it go away in a few short decades in legal terms? Certainly there was work by extraordinary abolitionists who risked their lives, but there may be something else happening alongside these brave abolitionists.
01:54
Consider energy and the industrial revolution. A single barrel of oil contains the energy equivalent of the work of five to 10 people. Add that to machines, and suddenly you've got millions of people's equivalent labor at your disposal. You can quit oppressing people and have a doubling in lifespan after a flattened lifespan for millennia. The world economy, which had been flat for millennia, all of a sudden explodes. And you get enormous amounts of wealth and food and other things produced by far fewer hands.
02:35
Technology changes the way we interact with each other in fundamental ways. New technologies like the machine gun completely changed the nature of warfare in World War I. It drove people into trenches. You were in the British trench, or you were in the German trench. Anything in between was no man's land. You entered no man's land, you were shot, you were killed. You tried to leave the trench in the other direction, then your own side would shoot you, because you were a deserter.
03:03
In a weird way, today's machine guns are narrowcast social media. We're shooting at each other. We're shooting at those we think are wrong, with posts, with tweets, with photographs, with accusations, with comments. And what it's done is it's created these two trenches, where you have to be either in this trench or that trench. And there's almost no middle ground to meet each other, to try and find some sort of a discussion between right and wrong.
03:33
As you drive around the United States, you see signs on lawns. Some say, "Black Lives Matter." Others say, "We support the police." You very rarely see both signs on the same lawn. And yet if you ask people, most people would probably support Black Lives Matter, and they would also support their police.
03:54
So as you think of these polarized times, as you think of right and wrong, you have to understand that right and wrong changes, and is now changing in exponential ways.
04:04
Take the issue of gay marriage. In 1996, two-thirds of the US population was against gay marriage. Today two-thirds is for. It's almost a 180-degree shift in opinion.
04:18
In part, this is because of protests, because people came out of the closet, because of AIDS, but a great deal of it has to do with social media. A great deal of it has to do with people out in our homes, in our living rooms, through television, through film, through posts, through people being comfortable enough, our friends, our neighbors, our family, to say, "I'm gay." And this has shifted opinion even in some of the most conservative of places.
04:48
Take the Pope. As Cardinal in 2010, he was completely against gay marriage. He becomes Pope. And three years after the last sentence, he comes out with "Who am I to judge?" And then today, he's in favor of civil unions.
05:05
As you're thinking about technology changing ethics, you also have to consider that technology is now moving exponentially. As right and wrong changes, if you take the position, "I know right. And if you completely disagree with me, if you partially disagree with me, if you even quibble with me, then you're wrong," then there's no discussion, no tolerance, no evolution, and certainly no learning.
05:28
Most of us are not vegetarians yet. Then again, we haven't had a whole lot of faster, better, cheaper alternatives to meat. But now that we're getting synthetic meats, as the price drops from 380,000 dollars in 2013 to 9 dollars today, a great big chunk of people are going to start becoming vegetarian or quasi-vegetarian. And then in retrospect, these pictures of walking into the fanciest, most expensive restaurants in town and walking past racks of bloody steaks are going to look very different in 10 years, in 20 years and 30 years.
06:08
In these polarized times, I'd like to revive two words you rarely hear today: humility and forgiveness.
06:17
When you judge the past, your ancestors, your forefathers, do so with a little bit more humility, because perhaps if you'd been educated in that time, if you'd lived in that time, you would've done a lot of things wrong. Not because they're right. Not because we don't see they're wrong today, but simply because our notions, our understanding of right and wrong, change across time.
06:41
The second word: forgiveness. Forgiveness is incredibly important these days. You cannot cancel somebody for saying the wrong word, for having done something 10 years ago, for having triggered you and not being a hundred percent right.
06:57
To build a community, you have to build it and talk to people and learn from people who may have very different points of view from yours. You have to allow them a space instead of creating a no man's land: a middle ground, a space of empathy.
07:15
This is a time to build community. This is not a time to continue ripping nations apart. Thank you very much.