Jeff Hancock: 3 types of (digital) lies

91,279 views ・ 2012-11-09

TED


00:00
Translator: Joseph Geni Reviewer: Morton Bast
Chinese translation: Ben Peng Reviewer: Ricki Xie

00:15
Let me tell you, it has been a fantastic month for deception. And I'm not even talking about the American presidential race. (Laughter) We have a high-profile journalist caught for plagiarism; a young superstar writer whose book involves so many made up quotes that they've pulled it from the shelves; a New York Times exposé on fake book reviews. It's been fantastic.

00:38
Now, of course, not all deception hits the news. Much of the deception is everyday. In fact, a lot of research shows that we all lie once or twice a day, as Dave suggested. So it's about 6:30 now, which suggests that most of us should have lied. Let's take a look at Winnipeg. How many of you, in the last 24 hours -- think back -- have told a little fib, or a big one? How many have told a little lie out there? All right, good. These are all the liars. Make sure you pay attention to them. (Laughter) No, that looked good, it was about two thirds of you. The other third didn't lie, or perhaps forgot, or you're lying to me about your lying, which is very, very devious. (Laughter) This fits with a lot of the research, which suggests that lying is very pervasive.

01:21
It's this pervasiveness, combined with the centrality to what it means to be a human, the fact that we can tell the truth or make something up, that has fascinated people throughout history. Here we have Diogenes with his lantern. Does anybody know what he was looking for? A single honest man, and he died without finding one back in Greece. And we have Confucius in the East who was really concerned with sincerity, not only that you walked the walk or talked the talk, but that you believed in what you were doing. You believed in your principles.

01:55
Now my first professional encounter with deception is a little bit later than these guys, a couple thousand years. I was a customs officer for Canada back in the mid-'90s. Yeah. I was defending Canada's borders. You may think that's a weapon right there. In fact, that's a stamp. I used a stamp to defend Canada's borders. (Laughter) Very Canadian of me. I learned a lot about deception while doing my duty here in customs, one of which was that most of what I thought I knew about deception was wrong, and I'll tell you about some of that tonight.

02:28
But even since just 1995, '96, the way we communicate has been completely transformed. We email, we text, we Skype, we Facebook. It's insane. Almost every aspect of human communication's been changed, and of course that's had an impact on deception. Let me tell you a little bit about a couple of new deceptions we've been tracking and documenting. They're called the Butler, the Sock Puppet and the Chinese Water Army. It sounds a little bit like a weird book, but actually they're all new types of lies.

02:59
Let's start with the Butlers. Here's an example of one: "On my way." Anybody ever written, "On my way?" Then you've also lied. (Laughter) We're never on our way. We're thinking about going on our way. Here's another one: "Sorry I didn't respond to you earlier. My battery was dead." Your battery wasn't dead. You weren't in a dead zone. You just didn't want to respond to that person that time. Here's the last one: You're talking to somebody, and you say, "Sorry, got work, gotta go." But really, you're just bored. You want to talk to somebody else.

03:30
Each of these is about a relationship, and this is a 24/7 connected world. Once you get my cell phone number, you can literally be in touch with me 24 hours a day. And so these lies are being used by people to create a buffer, like the butler used to do, between us and the connections to everybody else. But they're very special. They use ambiguity that comes from using technology. You don't know where I am or what I'm doing or who I'm with. And they're aimed at protecting the relationships. These aren't just people being jerks. These are people that are saying, look, I don't want to talk to you now, or I didn't want to talk to you then, but I still care about you. Our relationship is still important.

04:07
Now, the Sock Puppet, on the other hand, is a totally different animal. The sock puppet isn't about ambiguity, per se. It's about identity. Let me give you a very recent example, as in, like, last week. Here's R.J. Ellory, best-seller author in Britain. Here's one of his bestselling books. Here's a reviewer online, on Amazon. My favorite, by Nicodemus Jones, is, "Whatever else it might do, it will touch your soul." And of course, you might suspect that Nicodemus Jones is R.J. Ellory. He wrote very, very positive reviews about himself. Surprise, surprise.

04:42
Now this Sock Puppet stuff isn't actually that new. Walt Whitman also did this back in the day, before there was Internet technology. Sock Puppet becomes interesting when we get to scale, which is the domain of the Chinese Water Army. Chinese Water Army refers to thousands of people in China that are paid small amounts of money to produce content. It could be reviews. It could be propaganda. The government hires these people, companies hire them, all over the place. In North America, we call this Astroturfing, and Astroturfing is very common now. There are a lot of concerns about it. We see this especially with product reviews, book reviews, everything from hotels to whether that toaster is a good toaster or not.

05:25
Now, looking at these three reviews, or these three types of deception, you might think, wow, the Internet is really making us a deceptive species, especially when you think about the Astroturfing, where we can see deception brought up to scale. But actually, what I've been finding is very different from that.

05:44
Now, let's put aside the online anonymous sex chatrooms, which I'm sure none of you have been in. I can assure you there's deception there. And let's put aside the Nigerian prince who's emailed you about getting the 43 million out of the country. (Laughter) Let's forget about that guy, too. Let's focus on the conversations between our friends and our family and our coworkers and our loved ones. Those are the conversations that really matter. What does technology do to deception with those folks?

06:11
Here's a couple of studies. One of the studies we do is called a diary study, in which we ask people to record all of their conversations and all of their lies for seven days, and what we can do then is calculate how many lies took place per conversation within a medium, and the finding that we get that surprises people the most is that email is the most honest of those three media. And it really throws people for a loop because we think, well, there are no nonverbal cues, so why don't you lie more? The phone, in contrast, the most lies. Again and again and again we see the phone is the device that people lie on the most, and perhaps because of the Butler Lie ambiguities I was telling you about. This tends to be very different from what people expect.

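The arithmetic behind that comparison is just a per-medium average: total the lies recorded in the diary entries for each medium and divide by the number of conversations in that medium. A minimal sketch of that tally, with made-up diary entries standing in for the study's data:

```python
from collections import defaultdict

# Made-up diary entries for illustration: each records a conversation's
# medium and how many lies it contained (the real study used participants'
# seven-day diaries).
entries = [
    {"medium": "email", "lies": 0},
    {"medium": "phone", "lies": 2},
    {"medium": "face-to-face", "lies": 1},
    {"medium": "phone", "lies": 1},
    {"medium": "email", "lies": 0},
]

totals = defaultdict(lambda: {"lies": 0, "conversations": 0})
for entry in entries:
    totals[entry["medium"]]["lies"] += entry["lies"]
    totals[entry["medium"]]["conversations"] += 1

# Lies per conversation, by medium.
for medium, t in totals.items():
    print(f"{medium}: {t['lies'] / t['conversations']:.2f} lies per conversation")
```
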
06:54
What about résumés? We did a study in which we had people apply for a job, and they could apply for a job either with a traditional paper résumé, or on LinkedIn, which is a social networking site like Facebook, but for professionals -- involves the same information as a résumé. And what we found, to many people's surprise, was that those LinkedIn résumés were more honest on the things that mattered to employers, like your responsibilities or your skills at your previous job.

07:21
How about Facebook itself? You know, we always think that hey, there are these idealized versions, people are just showing the best things that happened in their lives. I've thought that many times. My friends, no way they can be that cool and have that good of a life. Well, one study tested this by examining people's personalities. They had four good friends of a person judge their personality. Then they had strangers, many strangers, judge the person's personality just from Facebook, and what they found was those judgments of the personality were pretty much identical, highly correlated, meaning that Facebook profiles really do reflect our actual personality.

07:55
All right, well, what about online dating? I mean, that's a pretty deceptive space. I'm sure you all have "friends" that have used online dating. (Laughter) And they would tell you about that guy that had no hair when he came, or the woman that didn't look at all like her photo. Well, we were really interested in it, and so what we did is we brought people, online daters, into the lab, and then we measured them. We got their height up against the wall, we put them on a scale, got their weight -- ladies loved that -- and then we actually got their driver's license to get their age. And what we found was very, very interesting.

08:28
Here's an example of the men and the height. Along the bottom is how tall they said they were in their profile. Along the y-axis, the vertical axis, is how tall they actually were. That diagonal line is the truth line. If their dot's on it, they were telling exactly the truth. Now, as you see, most of the little dots are below the line. What it means is all the guys were lying about their height. In fact, they lied about their height about nine tenths of an inch, what we say in the lab as "strong rounding up." (Laughter) You get to 5'8" and one tenth, and boom! 5'9". But what's really important here is, look at all those dots. They are clustering pretty close to the truth. What we found was 80 percent of our participants did indeed lie on one of those dimensions, but they always lied by a little bit. One of the reasons is pretty simple. If you go to a date, a coffee date, and you're completely different than what you said, game over. Right? So people lied frequently, but they lied subtly, not too much. They were constrained.

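One way to read that chart: plot stated height against measured height, take the diagonal y = x as the truth line, and note that any point below it is an overstatement. A small sketch with invented numbers (not the study's measurements) that happen to average out to the same nine tenths of an inch:

```python
import math

# Hypothetical (stated, actual) heights in inches, standing in for the
# lab measurements of online daters.
profiles = [(69, 68.1), (71, 70.3), (68, 68.0), (72, 70.9)]

# A point falls below the truth line y = x when stated > actual.
overstatements = [stated - actual for stated, actual in profiles if stated > actual]
print(f"{len(overstatements)}/{len(profiles)} profiles overstate height")
print(f"average overstatement: {sum(overstatements) / len(overstatements):.2f} inches")

# "Strong rounding up": an actual 5'8.1" (68.1 inches) reported as 5'9" (69 inches).
print(math.ceil(68.1))  # -> 69
```
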
09:27
Well, what explains all these studies? What explains the fact that despite our intuitions, mine included, a lot of online communication, technologically-mediated communication, is more honest than face to face? That really is strange. How do we explain this?

09:45
Well, to do that, one thing is we can look at the deception-detection literature. It's a very old literature by now, it's coming up on 50 years. It's been reviewed many times. There's been thousands of trials, hundreds of studies, and there's some really compelling findings. The first is, we're really bad at detecting deception, really bad. Fifty-four percent accuracy on average when you have to tell if somebody that just said a statement is lying or not. That's really bad. Why is it so bad?

10:13
Well, it has to do with Pinocchio's nose. If I were to ask you guys, what do you rely on when you're looking at somebody and you want to find out if they're lying? What cue do you pay attention to? Most of you would say that one of the cues you look at is the eyes. The eyes are the window to the soul. And you're not alone. Around the world, almost every culture, one of the top cues is eyes. But the research over the last 50 years says there's actually no reliable cue to deception, which blew me away, and it's one of the hard lessons that I learned when I was a customs officer. The eyes do not tell us whether somebody's lying or not. Some situations, yes -- high stakes, maybe their pupils dilate, their pitch goes up, their body movements change a little bit, but not all the time, not for everybody, it's not reliable.

10:57
Strange. The other thing is that just because you can't see me doesn't mean I'm going to lie. It's common sense, but one important finding is that we lie for a reason. We lie to protect ourselves or for our own gain or for somebody else's gain. So there are some pathological liars, but they make up a tiny portion of the population. We lie for a reason. Just because people can't see us doesn't mean we're going to necessarily lie.

11:20
But I think there's actually something much more interesting and fundamental going on here. The next big thing for me, the next big idea, we can find by going way back in history to the origins of language. Most linguists agree that we started speaking somewhere between 50,000 and 100,000 years ago. That's a long time ago. A lot of humans have lived since then. We've been talking, I guess, about fires and caves and saber-toothed tigers. I don't know what they talked about, but they were doing a lot of talking, and like I said, there's a lot of humans evolving speaking, about 100 billion people in fact.

11:55
What's important though is that writing only emerged about 5,000 years ago. So what that means is that all the people before there was any writing, every word that they ever said, every utterance disappeared. No trace. Evanescent. Gone. So we've been evolving to talk in a way in which there is no record. In fact, even the next big change to writing was only 500 years ago now, with the printing press, which is very recent in our past, and literacy rates remained incredibly low right up until World War II, so even the people of the last two millennia, most of the words they ever said -- poof! -- disappeared.

12:41
Let's turn to now, the networked age. How many of you have recorded something today? Anybody do any writing today? Did anybody write a word? It looks like almost every single person here recorded something. In this room, right now, we've probably recorded more than almost all of human pre-ancient history.

13:05
That is crazy. We're entering this amazing period of flux in human evolution where we've evolved to speak in a way in which our words disappear, but we're in an environment where we're recording everything. In fact, I think in the very near future, it's not just what we write that will be recorded, everything we do will be recorded.

13:25
What does that mean? What's the next big idea from that? Well, as a social scientist, this is the most amazing thing I have ever even dreamed of. Now, I can look at all those words that used to, for millennia, disappear. I can look at lies that before were said and then gone.

13:45
You remember those Astroturfing reviews that we were talking about before? Well, when they write a fake review, they have to post it somewhere, and it's left behind for us. So one thing that we did, and I'll give you an example of looking at the language, is we paid people to write some fake reviews. One of these reviews is fake. The person never was at the James Hotel. The other review is real. The person stayed there. Now, your task is to decide: which review is fake?

14:15
I'll give you a moment to read through them. But I want everybody to raise their hand at some point. Remember, I study deception. I can tell if you don't raise your hand. All right, how many of you believe that A is the fake? All right. Very good. About half. And how many of you think that B is? All right. Slightly more for B. Excellent. Here's the answer. B is a fake. Well done, second group. You dominated the first group. (Laughter) You're actually a little bit unusual. Every time we demonstrate this, it's usually about a 50-50 split, which fits with the research, 54 percent. Maybe people here in Winnipeg are more suspicious and better at figuring it out. Those cold, hard winters, I love it.

15:05
All right, so why do I care about this? Well, what I can do now with my colleagues in computer science is we can create computer algorithms that can analyze the linguistic traces of deception. Let me highlight a couple of things here in the fake review. The first is that liars tend to think about narrative. They make up a story: Who? And what happened? And that's what happened here. Our fake reviewers talked about who they were with and what they were doing. They also used the first person singular, I, way more than the people that actually stayed there. They were inserting themselves into the hotel review, kind of trying to convince you they were there. In contrast, the people that wrote the reviews that were actually there, their bodies actually entered the physical space, they talked a lot more about spatial information. They said how big the bathroom was, or they said, you know, here's how far shopping is from the hotel.

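A rough sketch of the kind of cue counting such an algorithm might start from is below: the rate of first-person-singular pronouns (higher in the fake reviews) versus the rate of spatial vocabulary (higher in the genuine ones). The word lists and example sentences are illustrative assumptions, not the actual trained model:

```python
import re

# Illustrative cue lists; a real classifier would be trained on labeled reviews.
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}
SPATIAL = {"bathroom", "room", "floor", "lobby", "block", "blocks",
           "far", "near", "small", "large", "street"}

def deception_features(review: str) -> dict:
    """Count simple linguistic cues: self-references vs. spatial detail."""
    words = re.findall(r"[a-z']+", review.lower())
    n = max(len(words), 1)
    return {
        "first_person_rate": sum(w in FIRST_PERSON for w in words) / n,
        "spatial_rate": sum(w in SPATIAL for w in words) / n,
    }

fake = "My husband and I arrived late, and I was thrilled the staff made my night special."
real = "The bathroom was small but clean, and the shops are only two blocks from the hotel."
print(deception_features(fake))  # high first-person rate, no spatial terms
print(deception_features(real))  # the reverse
```
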
16:00
Now, you guys did pretty well. Most people perform at chance at this task. Our computer algorithm is very accurate, much more accurate than humans can be, and it's not going to be accurate all the time. This isn't a deception-detection machine to tell if your girlfriend's lying to you on text messaging. We believe that every lie now, every type of lie -- fake hotel reviews, fake shoe reviews, your girlfriend cheating on you with text messaging -- those are all different lies. They're going to have different patterns of language. But because everything's recorded now, we can look at all of those kinds of lies.

16:34
Now, as I said, as a social scientist, this is wonderful. It's transformational. We're going to be able to learn so much more about human thought and expression, about everything from love to attitudes, because everything is being recorded now, but what does it mean for the average citizen? What does it mean for us in our lives?

16:55
Well, let's forget deception for a bit. One of the big ideas, I believe, is that we're leaving these huge traces behind. My outbox for email is massive, and I never look at it. I write all the time, but I never look at my record, at my trace. And I think we're going to see a lot more of that, where we can reflect on who we are by looking at what we wrote, what we said, what we did.

17:21
Now, if we bring it back to deception, there's a couple of take-away things here. First, lying online can be very dangerous, right? Not only are you leaving a record for yourself on your machine, but you're leaving a record on the person that you were lying to, and you're also leaving them around for me to analyze with some computer algorithms. So by all means, go ahead and do that, that's good.

17:43
But when it comes to lying and what we want to do with our lives, I think we can go back to Diogenes and Confucius. And they were less concerned about whether to lie or not to lie, and more concerned about being true to the self, and I think this is really important. Now, when you are about to say or do something, we can think, do I want this to be part of my legacy, part of my personal record? Because in the digital age we live in now, in the networked age, we are all leaving a record.

18:18
Thank you so much for your time, and good luck with your record. (Applause)