Translator: Yinchun Rui
Reviewer: Yolanda Zhang
00:12
So why do you think the rich should pay more in taxes?

00:16
Why did you buy the latest iPhone?

00:18
Why did you pick your current partner?

00:21
And why did so many people vote for Donald Trump?

00:24
What were the reasons, why did they do it?

00:27
So we ask this kind of question all the time,

00:30
and we expect to get an answer.

00:31
And when being asked, we expect ourselves to know the answer,

00:35
to simply tell why we did as we did.

00:38
But do we really know why?

00:41
So when you say that you prefer George Clooney to Tom Hanks,

00:44
due to his concern for the environment,

00:46
is that really true?

00:48
So you can be perfectly sincere and genuinely believe

00:51
that this is the reason that drives your choice,

00:54
but to me, it may still feel like something is missing.

00:57
As it stands, due to the nature of subjectivity,

01:00
it is actually very hard to ever prove that people are wrong about themselves.
01:06
So I'm an experimental psychologist,

01:08
and this is the problem we've been trying to solve in our lab.

01:12
So we wanted to create an experiment

01:14
that would allow us to challenge what people say about themselves,

01:18
regardless of how certain they may seem.

01:21
But tricking people about their own mind is hard.

01:24
So we turned to the professionals.

01:27
The magicians.

01:29
So they're experts at creating the illusion of a free choice.

01:32
So when they say, "Pick a card, any card,"

01:34
the only thing you know is that your choice is no longer free.

01:38
So we had a few fantastic brainstorming sessions

01:40
with a group of Swedish magicians,

01:42
and they helped us create a method

01:44
in which we would be able to manipulate the outcome of people's choices.

01:48
This way we would know when people are wrong about themselves,

01:51
even if they don't know this themselves.

01:54
So I will now show you a short movie showing this manipulation.
01:59
So it's quite simple.

02:00
The participants make a choice,

02:02
but I end up giving them the opposite.

02:05
And then we want to see: How did they react, and what did they say?

02:09
So it's quite simple, but see if you can spot the magic going on.

02:13
And this was shot with real participants, they don't know what's going on.

02:19
(Video) Petter Johansson: Hi, my name's Petter.

02:21
Woman: Hi, I'm Becka.

02:22
PJ: I'm going to show you pictures like this.

02:24
And you'll have to decide which one you find more attractive.

02:27
Becka: OK.

02:28
PJ: And then sometimes, I will ask you why you prefer that face.

02:32
Becka: OK.

02:33
PJ: Ready?
Becka: Yeah.

02:43
PJ: Why did you prefer that one?

02:44
Becka: The smile, I think.

02:46
PJ: Smile.

02:52
Man: One on the left.

02:57
Again, this one just struck me.

02:59
Interesting shot.

03:01
Since I'm a photographer, I like the way it's lit and looks.
03:06
Petter Johansson: But now comes the trick.

03:10
(Video) Woman 1: This one.

03:16
PJ: So they get the opposite of their choice.

03:20
And let's see what happens.

03:28
Woman 2: Um ...

03:35
I think he seems a little more innocent than the other guy.

03:45
Man: The one on the left.

03:49
I like her smile and contour of the nose and face.

03:53
So it's a little more interesting to me, and her haircut.

04:00
Woman 3: This one.

04:03
I like the smirky look better.

04:05
PJ: You like the smirky look better?

04:09
(Laughter)

04:12
Woman 3: This one.

04:15
PJ: What made you choose him?

04:17
Woman 3: I don't know, he looks a little bit like the Hobbit.

04:20
(Laughter)

04:22
PJ: And what happens in the end

04:24
when I tell them the true nature of the experiment?

04:27
Yeah, that's it. I just have to ask a few questions.
04:29
Man: Sure.

04:30
PJ: What did you think of this experiment, was it easy or hard?

04:33
Man: It was easy.

04:36
PJ: During the experiments,

04:37
I actually switched the pictures three times.

04:40
Was this anything you noticed?

04:42
Man: No. I didn't notice any of that.

04:44
PJ: Not at all?
Man: No.

04:45
Switching the pictures as far as ...

04:47
PJ: Yeah, you were pointing at one of them but I actually gave you the opposite.

04:51
Man: The opposite one. OK, when you --

04:53
No. Shows you how much my attention span was.

04:55
(Laughter)

04:58
PJ: Did you notice that sometimes during the experiment

05:01
I switched the pictures?

05:04
Woman 2: No, I did not notice that.

05:06
PJ: You were pointing at one, but then I gave you the other one.

05:09
No inclination of that happening?

05:11
Woman 2: No.

05:13
Woman 2: I did not notice.

05:14
(Laughs)

05:16
PJ: Thank you.

05:17
Woman 2: Thank you.
05:19
PJ: OK, so as you probably figured out now,

05:21
the trick is that I have two cards in each hand,

05:23
and when I hand one of them over,

05:25
the black one kind of disappears into the black surface on the table.

05:30
So using pictures like this,

05:32
normally not more than 20 percent of the participants detect these tricks.

05:36
And as you saw in the movie,

05:38
when in the end we explain what's going on,

05:41
they're very surprised and often refuse to believe the trick has been made.

05:45
So this shows that this effect is quite robust and a genuine effect.

05:50
But if you're interested in self-knowledge, as I am,

05:53
the more interesting bit is,

05:54
OK, so what did they say when they explained these choices?

05:58
So we've done a lot of analysis

06:00
of the verbal reports in these experiments.

06:03
And this graph simply shows

06:05
that if you compare what they say in a manipulated trial

06:10
with a nonmanipulated trial,

06:12
that is when they explain a normal choice they've made

06:14
and one where we manipulated the outcome,

06:17
we find that they are remarkably similar.
06:19
So they are just as emotional, just as specific,

06:22
and they are expressed with the same level of certainty.

06:27
So the strong conclusion to draw from this

06:29
is that if there are no differences

06:31
between a real choice and a manipulated choice,

06:35
perhaps we make things up all the time.

06:38
But we've also done studies

06:40
where we try to match what they say with the actual faces.

06:43
And then we find things like this.

06:45
So here, this male participant, he preferred the girl to the left,

06:50
he ended up with the one to the right.

06:52
And then, he explained his choice like this.

06:55
"She is radiant.

06:56
I would rather have approached her at the bar than the other one.

07:00
And I like earrings."

07:01
And whatever made him choose the girl on the left to begin with,

07:05
it can't have been the earrings,

07:06
because they were actually sitting on the girl on the right.

07:09
So this is a clear example of a post hoc construction.

07:13
So they just explained the choice afterwards.

07:17
So what this experiment shows is,
07:19
OK, so if we fail to detect that our choices have been changed,

07:23
we will immediately start to explain them in another way.

07:27
And what we also found

07:28
is that the participants often come to prefer the alternative,

07:32
that they were led to believe they liked.

07:34
So if we let them do the choice again,

07:36
they will now choose the face they had previously rejected.

07:41
So this is the effect we call "choice blindness."

07:43
And we've done a number of different studies --

07:46
we've tried consumer choices,

07:48
choices based on taste and smell and even reasoning problems.

07:53
But what you all want to know is of course

07:55
does this extend also to more complex, more meaningful choices?

07:59
Like those concerning moral and political issues.

08:04
So the next experiment, it needs a little bit of a background.

08:08
So in Sweden, the political landscape

08:12
is dominated by a left-wing and a right-wing coalition.

08:17
And the voters may move a little bit between the parties within each coalition,

08:22
but there is very little movement between the coalitions.
08:25
And before each election,

08:27
the newspapers and the polling institutes

08:31
put together what they call "an election compass"

08:34
which consists of a number of dividing issues

08:37
that sort of separates the two coalitions.

08:40
Things like if tax on gasoline should be increased

08:44
or if the 13 months of paid parental leave

08:48
should be split equally between the two parents

08:50
in order to increase gender equality.

08:54
So, before the last Swedish election,

08:57
we created an election compass of our own.

09:00
So we walked up to people in the street

09:02
and asked if they wanted to do a quick political survey.

09:06
So first we had them state their voting intention

09:08
between the two coalitions.

09:10
Then we asked them to answer 12 of these questions.

09:14
They would fill in their answers,

09:16
and we would ask them to discuss,

09:18
so OK, why do you think tax on gas should be increased?

09:23
And we'd go through the questions.
09:25
Then we had a color coded template

09:29
that would allow us to tally their overall score.

09:32
So this person would have one, two, three, four

09:36
five, six, seven, eight, nine scores to the left,

09:39
so he would lean to the left, basically.

09:42
And in the end, we also had them fill in their voting intention once more.

09:48
But of course, there was also a trick involved.

09:51
So first, we walked up to people,

09:53
we asked them about their voting intention

09:55
and then when they started filling in,

09:57
we would fill in a set of answers going in the opposite direction.

10:03
We would put it under the notepad.

10:06
And when we get the questionnaire,

10:08
we would simply glue it on top of the participant's own answer.

10:16
So there, it's gone.

10:24
And then we would ask about each of the questions:

10:26
How did you reason here?

10:28
And they'll state the reasons,

10:30
together we will sum up their overall score.

10:34
And in the end, they will state their voting intention again.
10:41
So what we find first of all here,

10:43
is that very few of these manipulations are detected.

10:47
And they're not detected in the sense that they realize,

10:50
"OK, you must have changed my answer,"

10:52
it was more the case that,

10:53
"OK, I must've misunderstood the question the first time I read it.

10:56
Can I please change it?"

10:59
And even if a few of these manipulations were changed back,

11:04
the overall majority was missed.

11:06
So we managed to switch 90 percent of the participants' answers

11:10
from left to right, right to left, their overall profile.

11:14
And what happens then when they are asked to motivate their choices?

11:20
And here we find much more interesting verbal reports

11:23
than compared to the faces.

11:25
People say things like this, and I'll read it to you.

11:29
So, "Large-scale governmental surveillance of email and internet traffic

11:33
ought to be permissible as means to combat international crime and terrorism."

11:37
"So you agree to some extent with this statement." "Yes."

11:40
"So how did you reason here?"
11:43
"Well, like, as it is so hard to get at international crime and terrorism,

11:48
I think there should be those kinds of tools."

11:51
And then the person remembers an argument from the newspaper in the morning.

11:55
"Like in the newspaper today,

11:56
it said they can like, listen to mobile phones from prison,

12:00
if a gang leader tries to continue his crimes from inside.

12:03
And I think it's madness that we have so little power

12:06
that we can't stop those things

12:08
when we actually have the possibility to do so."

12:11
And then there's a little bit back and forth in the end:

12:13
"I don't like that they have access to everything I do,

12:16
but I still think it's worth it in the long run."

12:19
So, if you didn't know that this person

12:21
just took part in a choice blindness experiment,

12:23
I don't think you would question

12:25
that this is the true attitude of that person.

12:29
And what happens in the end, with the voting intention?
12:32
What we find -- that one is also clearly affected by the questionnaire.

12:37
So we have 10 participants

12:39
shifting from left to right or from right to left.

12:42
We have another 19 that go from clear voting intention

12:44
to being uncertain.

12:46
Some go from being uncertain to clear voting intention.

12:49
And then there is a number of participants staying uncertain throughout.

12:54
And that number is interesting

12:55
because if you look at what the polling institutes say

13:00
the closer you get to an election,

13:02
the only people that are sort of in play

13:04
are the ones that are considered uncertain.

13:06
But we show there is a much larger number

13:10
that would actually consider shifting their attitudes.

13:13
And here I must point out, of course, that you are not allowed to use this

13:17
as an actual method to change people's votes

13:19
before an election,

13:21
and we clearly debriefed them afterwards

13:24
and gave them every opportunity to change back

13:27
to whatever they thought first.
13:30
But what this shows is that if you can get people

13:32
to see the opposite view and engage in a conversation with themselves,

13:38
that could actually make them change their views.

13:42
OK.

13:44
So what does it all mean?

13:46
What do I think is going on here?

13:48
So first of all,

13:50
a lot of what we call self-knowledge is actually self-interpretation.

13:55
So I see myself make a choice,

13:57
and then when I'm asked why,

14:00
I just try to make as much sense of it as possible

14:02
when I make an explanation.

14:04
But we do this so quickly and with such ease

14:07
that we think we actually know the answer when we answer why.

14:13
And as it is an interpretation,

14:16
of course we sometimes make mistakes.

14:18
The same way we make mistakes when we try to understand other people.

14:23
So beware when you ask people the question "why"

14:26
because what may happen is that, if you asked them,

14:31
"So why do you support this issue?"

14:35
"Why do you stay in this job or this relationship?" --

14:39
what may happen when you ask why is that you actually create an attitude

14:42
that wasn't there before you asked the question.
14:45
And this is of course important in your professional life, as well,

14:48
or it could be.

14:49
If, say, you design something and then you ask people,

14:52
"Why do you think this is good or bad?"

14:54
Or if you're a journalist asking a politician,

14:57
"So, why did you make this decision?"

15:00
Or if indeed you are a politician

15:02
and try to explain why a certain decision was made.

15:06
So this may all seem a bit disturbing.

15:09
But if you want to look at it from a positive direction,

15:13
it could be seen as showing,

15:14
OK, so we're actually a little bit more flexible than we think.

15:18
We can change our minds.

15:20
Our attitudes are not set in stone.

15:22
And we can also change the minds of others,

15:25
if we can only get them to engage with the issue

15:28
and see it from the opposite view.
15:31
And in my own personal life, since starting with this research --

15:35
So my partner and I, we've always had the rule

15:37
that you're allowed to take things back.

15:40
Just because I said I liked something a year ago,

15:42
doesn't mean I have to like it still.

15:45
And getting rid of the need to stay consistent

15:48
is actually a huge relief and makes relational life so much easier to live.

15:53
Anyway, so the conclusion must be:

15:57
know that you don't know yourself.

15:59
Or at least not as well as you think you do.

16:03
Thanks.

16:04
(Applause)