Translator: Lilian Chiu
Reviewer: Yanyan Hong
00:12
So why do you think the rich should pay more in taxes? Why did you buy the latest iPhone? Why did you pick your current partner? And why did so many people vote for Donald Trump? What were the reasons, why did they do it? So we ask this kind of question all the time, and we expect to get an answer. And when being asked, we expect ourselves to know the answer, to simply tell why we did as we did.
00:38
But do we really know why?
00:41
So when you say that you prefer George Clooney to Tom Hanks, due to his concern for the environment, is that really true? So you can be perfectly sincere and genuinely believe that this is the reason that drives your choice, but to me, it may still feel like something is missing. As it stands, due to the nature of subjectivity, it is actually very hard to ever prove that people are wrong about themselves.
01:06
So I'm an experimental psychologist, and this is the problem we've been trying to solve in our lab. So we wanted to create an experiment that would allow us to challenge what people say about themselves, regardless of how certain they may seem.
01:21
But tricking people about their own mind is hard. So we turned to the professionals. The magicians. So they're experts at creating the illusion of a free choice. So when they say, "Pick a card, any card," the only thing you know is that your choice is no longer free.
01:38
So we had a few fantastic brainstorming sessions with a group of Swedish magicians, and they helped us create a method in which we would be able to manipulate the outcome of people's choices. This way we would know when people are wrong about themselves, even if they don't know this themselves.
01:54
So I will now show you a short movie showing this manipulation. So it's quite simple. The participants make a choice, but I end up giving them the opposite. And then we want to see: How did they react, and what did they say? So it's quite simple, but see if you can spot the magic going on. And this was shot with real participants, they don't know what's going on.
02:19
(Video) Petter Johansson: Hi, my name's Petter.
Woman: Hi, I'm Becka.
PJ: I'm going to show you pictures like this. And you'll have to decide which one you find more attractive.
Becka: OK.
PJ: And then sometimes, I will ask you why you prefer that face.
Becka: OK.
PJ: Ready?
Becka: Yeah.
02:43
PJ: Why did you prefer that one?
Becka: The smile, I think.
PJ: Smile.

02:52
Man: One on the left. Again, this one just struck me. Interesting shot. Since I'm a photographer, I like the way it's lit and looks.
03:06
Petter Johansson: But now comes the trick.

03:10
(Video) Woman 1: This one.

03:16
PJ: So they get the opposite of their choice. And let's see what happens.
03:28
Woman 2: Um ... I think he seems a little more innocent than the other guy.

03:45
Man: The one on the left. I like her smile and contour of the nose and face. So it's a little more interesting to me, and her haircut.
04:00
Woman 3: This one. I like the smirky look better.
PJ: You like the smirky look better?
(Laughter)

04:12
Woman 3: This one.
PJ: What made you choose him?
Woman 3: I don't know, he looks a little bit like the Hobbit.
(Laughter)
04:22
PJ: And what happens in the end when I tell them the true nature of the experiment? Yeah, that's it. I just have to ask a few questions.
Man: Sure.
PJ: What did you think of this experiment, was it easy or hard?
Man: It was easy.
04:36
PJ: During the experiments, I actually switched the pictures three times. Was this anything you noticed?
Man: No. I didn't notice any of that.
PJ: Not at all?
Man: No. Switching the pictures as far as ...
PJ: Yeah, you were pointing at one of them but I actually gave you the opposite.
Man: The opposite one. OK, when you -- No. Shows you how much my attention span was.
(Laughter)
04:58
PJ: Did you notice that sometimes during the experiment I switched the pictures?
Woman 2: No, I did not notice that.
PJ: You were pointing at one, but then I gave you the other one. No inclination of that happening?
Woman 2: No. I did not notice.
(Laughs)
PJ: Thank you.
Woman 2: Thank you.
05:19
PJ: OK, so as you probably figured out now, the trick is that I have two cards in each hand, and when I hand one of them over, the black one kind of disappears into the black surface on the table.

05:30
So using pictures like this, normally not more than 20 percent of the participants detect these trials. And as you saw in the movie, when in the end we explain what's going on, they're very surprised and often refuse to believe the trick has been made. So this shows that this effect is quite robust and a genuine effect.
05:50
But if you're interested in self-knowledge, as I am, the more interesting bit is, OK, so what did they say when they explained these choices? So we've done a lot of analysis of the verbal reports in these experiments.
06:03
And this graph simply shows that if you compare what they say in a manipulated trial with a nonmanipulated trial, that is when they explain a normal choice they've made and one where we manipulated the outcome, we find that they are remarkably similar. So they are just as emotional, just as specific, and they are expressed with the same level of certainty.
06:27
So the strong conclusion to draw from this is that if there are no differences between a real choice and a manipulated choice, perhaps we make things up all the time.
06:38
But we've also done studies where we try to match what they say with the actual faces. And then we find things like this.
06:45
So here, this male participant, he preferred the girl to the left, he ended up with the one to the right. And then, he explained his choice like this. "She is radiant. I would rather have approached her at the bar than the other one. And I like earrings." And whatever made him choose the girl on the left to begin with, it can't have been the earrings, because they were actually sitting on the girl on the right. So this is a clear example of a post hoc construction. So they just explained the choice afterwards.
07:17
So what this experiment shows is, OK, so if we fail to detect that our choices have been changed, we will immediately start to explain them in another way. And what we also found is that the participants often come to prefer the alternative that they were led to believe they liked. So if we let them do the choice again, they will now choose the face they had previously rejected.
07:41
So this is the effect we call "choice blindness." And we've done a number of different studies -- we've tried consumer choices, choices based on taste and smell and even reasoning problems. But what you all want to know is of course does this extend also to more complex, more meaningful choices? Like those concerning moral and political issues.
08:04
So the next experiment, it needs a little bit of a background. So in Sweden, the political landscape is dominated by a left-wing and a right-wing coalition. And the voters may move a little bit between the parties within each coalition, but there is very little movement between the coalitions.
08:25
And before each election, the newspapers and the polling institutes put together what they call "an election compass" which consists of a number of dividing issues that sort of separates the two coalitions. Things like if tax on gasoline should be increased, or if the 13 months of paid parental leave should be split equally between the two parents in order to increase gender equality.
08:54
So, before the last Swedish election, we created an election compass of our own. So we walked up to people in the street and asked if they wanted to do a quick political survey. So first we had them state their voting intention between the two coalitions.
09:10
Then we asked them to answer 12 of these questions. They would fill in their answers, and we would ask them to discuss, so OK, why do you think tax on gas should be increased? And we'd go through the questions.
09:25
Then we had a color-coded template that would allow us to tally their overall score. So this person would have one, two, three, four, five, six, seven, eight, nine scores to the left, so he would lean to the left, basically.
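As an aside, the tally described here is just a majority count over the coded answers. A minimal sketch of that counting step (the function name and the "left"/"right" answer coding are illustrative assumptions, not the study's actual materials):

```python
# Hypothetical sketch of the election-compass tally: each of the 12 answers
# is coded by which coalition it leans toward, and the majority side gives
# the participant's overall lean.
from collections import Counter

def overall_lean(answers):
    """answers: list of 'left'/'right' codes, one per survey question."""
    counts = Counter(answers)
    if counts["left"] > counts["right"]:
        return "left"
    if counts["right"] > counts["left"]:
        return "right"
    return "uncertain"

# The participant in the example: 9 of 12 answers score to the left.
print(overall_lean(["left"] * 9 + ["right"] * 3))  # left
```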
09:42
And in the end, we also had them fill in their voting intention once more.
09:48
But of course, there was also a trick involved. So first, we walked up to people, we asked them about their voting intention and then when they started filling in, we would fill in a set of answers going in the opposite direction. We would put it under the notepad. And when we get the questionnaire, we would simply glue it on top of the participant's own answer. So there, it's gone.
10:24
And then we would ask about each of the questions: How did you reason here? And they'll state the reasons, together we will sum up their overall score. And in the end, they will state their voting intention again.
10:41
So what we find first of all here is that very few of these manipulations are detected. And they're not detected in the sense that they realize, "OK, you must have changed my answer," it was more the case that, "OK, I must've misunderstood the question the first time I read it. Can I please change it?" And even if a few of these manipulations were changed back, the overall majority was missed. So we managed to switch 90 percent of the participants' answers from left to right, right to left, their overall profile.
11:14
And what happens then when they are asked to motivate their choices? And here we find much more interesting verbal reports compared to the faces.
11:25
People say things like this, and I'll read it to you. So, "Large-scale governmental surveillance of email and internet traffic ought to be permissible as means to combat international crime and terrorism." "So you agree to some extent with this statement." "Yes." "So how did you reason here?" "Well, like, as it is so hard to get at international crime and terrorism, I think there should be those kinds of tools."
11:51
And then the person remembers an argument from the newspaper in the morning. "Like in the newspaper today, it said they can, like, listen to mobile phones from prison, if a gang leader tries to continue his crimes from inside. And I think it's madness that we have so little power that we can't stop those things when we actually have the possibility to do so." And then there's a little bit back and forth in the end: "I don't like that they have access to everything I do, but I still think it's worth it in the long run."
12:19
So, if you didn't know that this person just took part in a choice blindness experiment, I don't think you would question that this is the true attitude of that person.
12:29
And what happens in the end, with the voting intention? What we find -- that one is also clearly affected by the questionnaire. So we have 10 participants shifting from left to right or from right to left. We have another 19 that go from clear voting intention to being uncertain. Some go from being uncertain to clear voting intention. And then there is a number of participants staying uncertain throughout.
12:54
And that number is interesting because if you look at what the polling institutes say, the closer you get to an election, the only people that are sort of in play are the ones that are considered uncertain. But we show there is a much larger number that would actually consider shifting their attitudes.
13:13
And here I must point out, of course, that you are not allowed to use this as an actual method to change people's votes before an election, and we clearly debriefed them afterwards and gave them every opportunity to change back to whatever they thought first.
13:30
But what this shows is that if you can get people to see the opposite view and engage in a conversation with themselves, that could actually make them change their views.
13:42
OK. So what does it all mean? What do I think is going on here?
13:48
So first of all, a lot of what we call self-knowledge is actually self-interpretation. So I see myself make a choice, and then when I'm asked why, I just try to make as much sense of it as possible when I make an explanation. But we do this so quickly and with such ease that we think we actually know the answer when we answer why.
14:13
And as it is an interpretation, of course we sometimes make mistakes. The same way we make mistakes when we try to understand other people.
14:23
So beware when you ask people the question "why" because what may happen is that, if you asked them, "So why do you support this issue?" "Why do you stay in this job or this relationship?" -- what may happen when you ask why is that you actually create an attitude that wasn't there before you asked the question.
14:45
And this is of course important in your professional life, as well, or it could be. If, say, you design something and then you ask people, "Why do you think this is good or bad?" Or if you're a journalist asking a politician, "So, why did you make this decision?" Or if indeed you are a politician and try to explain why a certain decision was made.
15:06
So this may all seem a bit disturbing. But if you want to look at it from a positive direction, it could be seen as showing, OK, so we're actually a little bit more flexible than we think. We can change our minds. Our attitudes are not set in stone. And we can also change the minds of others, if we can only get them to engage with the issue and see it from the opposite view.
15:31
And in my own personal life, since starting with this research -- So my partner and I, we've always had the rule that you're allowed to take things back. Just because I said I liked something a year ago doesn't mean I have to like it still. And getting rid of the need to stay consistent is actually a huge relief and makes relational life so much easier to live.
15:53
Anyway, so the conclusion must be: know that you don't know yourself. Or at least not as well as you think you do. Thanks.

16:04
(Applause)