Jeff Hancock: 3 types of (digital) lies

91,279 views ・ 2012-11-09

TED



Translator: Joseph Geni / Reviewer: Morton Bast

00:15
Let me tell you, it has been a fantastic month for deception. And I'm not even talking about the American presidential race. (Laughter) We have a high-profile journalist caught for plagiarism, a young superstar writer whose book involves so many made up quotes that they've pulled it from the shelves; a New York Times exposé on fake book reviews. It's been fantastic.

00:38
Now, of course, not all deception hits the news. Much of the deception is everyday. In fact, a lot of research shows that we all lie once or twice a day, as Dave suggested. So it's about 6:30 now, which suggests that most of us should have lied. Let's take a look at Winnipeg. How many of you, in the last 24 hours -- think back -- have told a little fib, or a big one? How many have told a little lie out there? All right, good. These are all the liars. Make sure you pay attention to them. (Laughter) No, that looked good, it was about two thirds of you. The other third didn't lie, or perhaps forgot, or you're lying to me about your lying, which is very, very devious. (Laughter) This fits with a lot of the research, which suggests that lying is very pervasive.

01:21
It's this pervasiveness, combined with the centrality to what it means to be a human, the fact that we can tell the truth or make something up, that has fascinated people throughout history. Here we have Diogenes with his lantern. Does anybody know what he was looking for? A single honest man, and he died without finding one back in Greece. And we have Confucius in the East who was really concerned with sincerity, not only that you walked the walk or talked the talk, but that you believed in what you were doing. You believed in your principles.

01:55
Now my first professional encounter with deception is a little bit later than these guys, a couple thousand years. I was a customs officer for Canada back in the mid-'90s. Yeah. I was defending Canada's borders. You may think that's a weapon right there. In fact, that's a stamp. I used a stamp to defend Canada's borders. (Laughter) Very Canadian of me. I learned a lot about deception while doing my duty here in customs, one of which was that most of what I thought I knew about deception was wrong, and I'll tell you about some of that tonight.

02:28
But even since just 1995, '96, the way we communicate has been completely transformed. We email, we text, we skype, we Facebook. It's insane. Almost every aspect of human communication's been changed, and of course that's had an impact on deception. Let me tell you a little bit about a couple of new deceptions we've been tracking and documenting. They're called the Butler, the Sock Puppet and the Chinese Water Army. It sounds a little bit like a weird book, but actually they're all new types of lies.

02:59
Let's start with the Butlers. Here's an example of one: "On my way." Anybody ever written, "On my way?" Then you've also lied. (Laughter) We're never on our way. We're thinking about going on our way. Here's another one: "Sorry I didn't respond to you earlier. My battery was dead." Your battery wasn't dead. You weren't in a dead zone. You just didn't want to respond to that person at that time. Here's the last one: You're talking to somebody, and you say, "Sorry, got work, gotta go." But really, you're just bored. You want to talk to somebody else.

03:30
Each of these is about a relationship, and this is a 24/7 connected world. Once you get my cell phone number, you can literally be in touch with me 24 hours a day. And so these lies are being used by people to create a buffer, like the butler used to do, between us and the connections to everybody else. But they're very special. They use ambiguity that comes from using technology. You don't know where I am or what I'm doing or who I'm with. And they're aimed at protecting the relationships. These aren't just people being jerks. These are people that are saying, look, I don't want to talk to you now, or I didn't want to talk to you then, but I still care about you. Our relationship is still important.

04:07
Now, the Sock Puppet, on the other hand, is a totally different animal. The sock puppet isn't about ambiguity, per se. It's about identity. Let me give you a very recent example, as in, like, last week. Here's R.J. Ellory, best-seller author in Britain. Here's one of his bestselling books. Here's a reviewer online, on Amazon. My favorite, by Nicodemus Jones, is, "Whatever else it might do, it will touch your soul." And of course, you might suspect that Nicodemus Jones is R.J. Ellory. He wrote very, very positive reviews about himself. Surprise, surprise.

04:42
Now this Sock Puppet stuff isn't actually that new. Walt Whitman also did this back in the day, before there was Internet technology. Sock Puppet becomes interesting when we get to scale, which is the domain of the Chinese Water Army. Chinese Water Army refers to thousands of people in China that are paid small amounts of money to produce content. It could be reviews. It could be propaganda. The government hires these people, companies hire them, all over the place. In North America, we call this Astroturfing, and Astroturfing is very common now. There's a lot of concerns about it. We see this especially with product reviews, book reviews, everything from hotels to whether that toaster is a good toaster or not.

05:25
Now, looking at these three reviews, or these three types of deception, you might think, wow, the Internet is really making us a deceptive species, especially when you think about the Astroturfing, where we can see deception brought up to scale. But actually, what I've been finding is very different from that. Now, let's put aside the online anonymous sex chatrooms, which I'm sure none of you have been in. I can assure you there's deception there. And let's put aside the Nigerian prince who's emailed you about getting the 43 million out of the country. (Laughter) Let's forget about that guy, too. Let's focus on the conversations between our friends and our family and our coworkers and our loved ones. Those are the conversations that really matter. What does technology do to deception with those folks?

06:11
Here's a couple of studies. One of the studies we do are called diary studies, in which we ask people to record all of their conversations and all of their lies for seven days, and what we can do then is calculate how many lies took place per conversation within a medium, and the finding that we get that surprises people the most is that email is the most honest of those three media. And it really throws people for a loop because we think, well, there's no nonverbal cues, so why don't you lie more? The phone, in contrast, the most lies. Again and again and again we see the phone is the device that people lie on the most, and perhaps because of the Butler Lie ambiguities I was telling you about. This tends to be very different from what people expect.

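The comparison behind that finding is simple to picture. Here is a minimal sketch in Python, assuming a hypothetical diary log where each entry records the medium, the number of conversations, and the number of lies reported; the entries below are invented for illustration and are not data from the actual study.

```python
from collections import defaultdict

# Hypothetical diary entries: (medium, conversations, lies) for one participant-day.
# Values are illustrative only, not measurements from the diary studies.
diary = [
    ("email", 4, 0),
    ("phone", 3, 2),
    ("face-to-face", 10, 1),
    ("email", 2, 1),
    ("phone", 5, 3),
    ("face-to-face", 8, 2),
]

totals = defaultdict(lambda: [0, 0])  # medium -> [conversations, lies]
for medium, conversations, lies in diary:
    totals[medium][0] += conversations
    totals[medium][1] += lies

# Lies per conversation, by medium: the statistic the diary studies compare.
for medium, (conversations, lies) in totals.items():
    print(f"{medium}: {lies / conversations:.2f} lies per conversation")
```
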
06:54
What about résumés? We did a study in which we had people apply for a job, and they could apply for a job either with a traditional paper résumé, or on LinkedIn, which is a social networking site like Facebook, but for professionals -- involves the same information as a résumé. And what we found, to many people's surprise, was that those LinkedIn résumés were more honest on the things that mattered to employers, like your responsibilities or your skills at your previous job.

07:21
How about Facebook itself? You know, we always think that hey, there are these idealized versions, people are just showing the best things that happened in their lives. I've thought that many times. My friends, no way they can be that cool and have that good of a life. Well, one study tested this by examining people's personalities. They had four good friends of a person judge their personality. Then they had strangers, many strangers, judge the person's personality just from Facebook, and what they found was those judgments of the personality were pretty much identical, highly correlated, meaning that Facebook profiles really do reflect our actual personality.

07:55
All right, well, what about online dating? I mean, that's a pretty deceptive space. I'm sure you all have "friends" that have used online dating. (Laughter) And they would tell you about that guy that had no hair when he came, or the woman that didn't look at all like her photo. Well, we were really interested in it, and so what we did is we brought people, online daters, into the lab, and then we measured them. We got their height up against the wall, we put them on a scale, got their weight -- ladies loved that -- and then we actually got their driver's license to get their age. And what we found was very, very interesting.

08:28
Here's an example of the men and the height. Along the bottom is how tall they said they were in their profile. Along the y-axis, the vertical axis, is how tall they actually were. That diagonal line is the truth line. If their dot's on it, they were telling exactly the truth. Now, as you see, most of the little dots are below the line. What it means is all the guys were lying about their height. In fact, they lied about their height by about nine tenths of an inch, what we say in the lab as "strong rounding up." (Laughter) You get to 5'8" and one tenth, and boom! 5'9".

09:03
But what's really important here is, look at all those dots. They are clustering pretty close to the truth. What we found was 80 percent of our participants did indeed lie on one of those dimensions, but they always lied by a little bit. One of the reasons is pretty simple. If you go to a date, a coffee date, and you're completely different than what you said, game over. Right? So people lied frequently, but they lied subtly, not too much. They were constrained.

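To make the truth-line idea concrete: with claimed height on one axis and measured height on the other, an honest profile sits on the diagonal, and a dot below it means the profile overstates the truth. A minimal sketch of that check, using invented numbers rather than the study's measurements:

```python
# Each pair is (height claimed in the profile, height actually measured), in inches.
# The numbers are made up for illustration; they are not the study's data.
profiles = [(69, 68.1), (71, 70.6), (68, 68.0), (72, 70.9), (70, 69.5)]

exaggerations = [claimed - actual for claimed, actual in profiles]
liars = [e for e in exaggerations if e > 0]  # dots falling below the truth line

print(f"{len(liars)}/{len(profiles)} profiles overstate height")
print(f"average exaggeration: {sum(exaggerations) / len(exaggerations):.2f} inches")
```
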
09:27
Well, what explains all these studies? What explains the fact that despite our intuitions, mine included, a lot of online communication, technologically-mediated communication, is more honest than face to face? That really is strange. How do we explain this?

09:45
Well, to do that, one thing is we can look at the deception-detection literature. It's a very old literature by now, it's coming up on 50 years. It's been reviewed many times. There's been thousands of trials, hundreds of studies, and there's some really compelling findings. The first is, we're really bad at detecting deception, really bad. Fifty-four percent accuracy on average when you have to tell if somebody that just said a statement is lying or not. That's really bad. Why is it so bad? Well, it has to do with Pinocchio's nose.

10:16
If I were to ask you guys, what do you rely on when you're looking at somebody and you want to find out if they're lying? What cue do you pay attention to? Most of you would say that one of the cues you look at is the eyes. The eyes are the window to the soul. And you're not alone. Around the world, almost every culture, one of the top cues is eyes. But the research over the last 50 years says there's actually no reliable cue to deception, which blew me away, and it's one of the hard lessons that I learned when I was a customs officer. The eyes do not tell us whether somebody's lying or not. Some situations, yes -- high stakes, maybe their pupils dilate, their pitch goes up, their body movements change a little bit, but not all the time, not for everybody, it's not reliable.

10:57
Strange. The other thing is that just because you can't see me doesn't mean I'm going to lie. It's common sense, but one important finding is that we lie for a reason. We lie to protect ourselves or for our own gain or for somebody else's gain. So there are some pathological liars, but they make up a tiny portion of the population. We lie for a reason. Just because people can't see us doesn't mean we're going to necessarily lie.

11:20
But I think there's actually something much more interesting and fundamental going on here. The next big thing for me, the next big idea, we can find by going way back in history to the origins of language. Most linguists agree that we started speaking somewhere between 50,000 and 100,000 years ago. That's a long time ago. A lot of humans have lived since then. We've been talking, I guess, about fires and caves and saber-toothed tigers. I don't know what they talked about, but they were doing a lot of talking, and like I said, there's a lot of humans evolving speaking, about 100 billion people in fact.

11:55
What's important though is that writing only emerged about 5,000 years ago. So what that means is that all the people before there was any writing, every word that they ever said, every utterance disappeared. No trace. Evanescent. Gone. So we've been evolving to talk in a way in which there is no record. In fact, even the next big change to writing was only 500 years ago now, with the printing press, which is very recent in our past, and literacy rates remained incredibly low right up until World War II, so even the people of the last two millennia, most of the words they ever said -- poof! -- disappeared.

12:41
Let's turn to now, the networked age. How many of you have recorded something today? Anybody do any writing today? Did anybody write a word? It looks like almost every single person here recorded something. In this room, right now, we've probably recorded more than almost all of human pre-ancient history. That is crazy. We're entering this amazing period of flux in human evolution where we've evolved to speak in a way in which our words disappear, but we're in an environment where we're recording everything. In fact, I think in the very near future, it's not just what we write that will be recorded, everything we do will be recorded.

13:25
What does that mean? What's the next big idea from that? Well, as a social scientist, this is the most amazing thing I have ever even dreamed of. Now, I can look at all those words that used to, for millennia, disappear. I can look at lies that before were said and then gone. You remember those Astroturfing reviews that we were talking about before? Well, when they write a fake review, they have to post it somewhere, and it's left behind for us.

13:54
So one thing that we did, and I'll give you an example of looking at the language, is we paid people to write some fake reviews. One of these reviews is fake. The person never was at the James Hotel. The other review is real. The person stayed there. Now, your task is to decide which review is fake. I'll give you a moment to read through them.

14:20
But I want everybody to raise their hand at some point. Remember, I study deception. I can tell if you don't raise your hand. All right, how many of you believe that A is the fake? All right. Very good. About half. And how many of you think that B is? All right. Slightly more for B. Excellent. Here's the answer. B is a fake. Well done, second group. You dominated the first group. (Laughter) You're actually a little bit unusual. Every time we demonstrate this, it's usually about a 50-50 split, which fits with the research, 54 percent. Maybe people here in Winnipeg are more suspicious and better at figuring it out. Those cold, hard winters, I love it.

15:05
All right, so why do I care about this? Well, what I can do now with my colleagues in computer science is we can create computer algorithms that can analyze the linguistic traces of deception. Let me highlight a couple of things here in the fake review. The first is that liars tend to think about narrative. They make up a story: Who? And what happened? And that's what happened here. Our fake reviewers talked about who they were with and what they were doing. They also used the first person singular, I, way more than the people that actually stayed there. They were inserting themselves into the hotel review, kind of trying to convince you they were there. In contrast, the people that wrote the reviews that were actually there, their bodies actually entered the physical space, they talked a lot more about spatial information. They said how big the bathroom was, or they said, you know, here's how far shopping is from the hotel.

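As a rough illustration of how cues like these could be turned into features for a text classifier, here is a minimal sketch. The word lists, rates, and thresholds are hypothetical stand-ins chosen for the example, not the actual algorithm behind this work.

```python
import re

FIRST_PERSON = {"i", "me", "my", "mine", "myself"}
SPATIAL = {"bathroom", "room", "floor", "block", "street", "lobby",
           "near", "far", "across", "downtown", "location"}

def deception_features(review: str) -> dict:
    """Per-word rates of two cues discussed in the talk (illustrative only)."""
    words = re.findall(r"[a-z']+", review.lower())
    n = max(len(words), 1)
    return {
        "first_person_rate": sum(w in FIRST_PERSON for w in words) / n,
        "spatial_rate": sum(w in SPATIAL for w in words) / n,
    }

def looks_fake(review: str) -> bool:
    """Toy rule: lots of 'I', little spatial detail. Thresholds are made up."""
    f = deception_features(review)
    return f["first_person_rate"] > 0.05 and f["spatial_rate"] < 0.01

print(deception_features("My wife and I loved it. I would go back, I really would."))
print(deception_features("The room was large and the lobby is a block from the shops."))
```

A real system would learn which cues matter, and by how much, from a large set of labeled genuine and fake reviews rather than hard-coding word lists and thresholds.
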
16:00
Now, you guys did pretty well. Most people perform at chance at this task. Our computer algorithm is very accurate, much more accurate than humans can be, and it's not going to be accurate all the time. This isn't a deception-detection machine to tell if your girlfriend's lying to you on text messaging. We believe that every lie now, every type of lie -- fake hotel reviews, fake shoe reviews, your girlfriend cheating on you with text messaging -- those are all different lies. They're going to have different patterns of language. But because everything's recorded now, we can look at all of those kinds of lies.

16:34
Now, as I said, as a social scientist, this is wonderful. It's transformational. We're going to be able to learn so much more about human thought and expression, about everything from love to attitudes, because everything is being recorded now, but what does it mean for the average citizen? What does it mean for us in our lives?

16:55
Well, let's forget deception for a bit. One of the big ideas, I believe, is that we're leaving these huge traces behind. My outbox for email is massive, and I never look at it. I write all the time, but I never look at my record, at my trace. And I think we're going to see a lot more of that, where we can reflect on who we are by looking at what we wrote, what we said, what we did.

17:21
Now, if we bring it back to deception, there's a couple of take-away things here. First, lying online can be very dangerous, right? Not only are you leaving a record for yourself on your machine, but you're leaving a record on the person that you were lying to, and you're also leaving them around for me to analyze with some computer algorithms. So by all means, go ahead and do that, that's good.

17:43
But when it comes to lying and what we want to do with our lives, I think we can go back to Diogenes and Confucius. And they were less concerned about whether to lie or not to lie, and more concerned about being true to the self, and I think this is really important. Now, when you are about to say or do something, we can think, do I want this to be part of my legacy, part of my personal record? Because in the digital age we live in now, in the networked age, we are all leaving a record. Thank you so much for your time, and good luck with your record. (Applause)
