Do you really know why you do what you do? | Petter Johansson

193,440 views ・ 2018-03-27

TED


00:12
So why do you think the rich should pay more in taxes? Why did you buy the latest iPhone? Why did you pick your current partner? And why did so many people vote for Donald Trump? What were the reasons, why did they do it? So we ask this kind of question all the time, and we expect to get an answer. And when being asked, we expect ourselves to know the answer, to simply tell why we did as we did. But do we really know why?

00:41
So when you say that you prefer George Clooney to Tom Hanks, due to his concern for the environment, is that really true? So you can be perfectly sincere and genuinely believe that this is the reason that drives your choice, but to me, it may still feel like something is missing. As it stands, due to the nature of subjectivity, it is actually very hard to ever prove that people are wrong about themselves.

01:06
So I'm an experimental psychologist, and this is the problem we've been trying to solve in our lab. So we wanted to create an experiment that would allow us to challenge what people say about themselves, regardless of how certain they may seem.

01:21
But tricking people about their own mind is hard. So we turned to the professionals: the magicians. So they're experts at creating the illusion of a free choice. So when they say, "Pick a card, any card," the only thing you know is that your choice is no longer free.

01:38
So we had a few fantastic brainstorming sessions with a group of Swedish magicians, and they helped us create a method in which we would be able to manipulate the outcome of people's choices. This way we would know when people are wrong about themselves, even if they don't know this themselves.

01:54
So I will now show you a short movie showing this manipulation. So it's quite simple. The participants make a choice, but I end up giving them the opposite. And then we want to see: How did they react, and what did they say? So it's quite simple, but see if you can spot the magic going on. And this was shot with real participants; they don't know what's going on.

02:19
(Video) Petter Johansson: Hi, my name's Petter.
Woman: Hi, I'm Becka.
PJ: I'm going to show you pictures like this. And you'll have to decide which one you find more attractive.
Becka: OK.
PJ: And then sometimes, I will ask you why you prefer that face.
Becka: OK.
PJ: Ready? Becka: Yeah.

02:43
PJ: Why did you prefer that one?
Becka: The smile, I think.
PJ: Smile.

02:52
Man: One on the left. Again, this one just struck me. Interesting shot. Since I'm a photographer, I like the way it's lit and looks.

03:06
Petter Johansson: But now comes the trick.

03:10
(Video) Woman 1: This one.

03:16
PJ: So they get the opposite of their choice. And let's see what happens.

03:28
Woman 2: Um ... I think he seems a little more innocent than the other guy.

03:45
Man: The one on the left. I like her smile and contour of the nose and face. So it's a little more interesting to me, and her haircut.

04:00
Woman 3: This one. I like the smirky look better.
PJ: You like the smirky look better?
(Laughter)

04:12
Woman 3: This one.
PJ: What made you choose him?
Woman 3: I don't know, he looks a little bit like the Hobbit.
(Laughter)

04:22
PJ: And what happens in the end when I tell them the true nature of the experiment? Yeah, that's it. I just have to ask a few questions.
Man: Sure.
PJ: What did you think of this experiment, was it easy or hard?
Man: It was easy.

04:36
PJ: During the experiments, I actually switched the pictures three times. Was this anything you noticed?
Man: No. I didn't notice any of that.
PJ: Not at all?
Man: No. Switching the pictures as far as ...
PJ: Yeah, you were pointing at one of them, but I actually gave you the opposite.
Man: The opposite one. OK, when you -- No. Shows you how much my attention span was.
(Laughter)

04:58
PJ: Did you notice that sometimes during the experiment I switched the pictures?
Woman 2: No, I did not notice that.
PJ: You were pointing at one, but then I gave you the other one. No inclination of that happening?
Woman 2: No. I did not notice.
(Laughs)
PJ: Thank you.
Woman 2: Thank you.

05:19
PJ: OK, so as you probably figured out by now, the trick is that I have two cards in each hand, and when I hand one of them over, the black one kind of disappears into the black surface on the table.

05:30
So using pictures like this, normally not more than 20 percent of the participants detect these switches. And as you saw in the movie, when in the end we explain what's going on, they're very surprised and often refuse to believe that a trick has been played. So this shows that this effect is quite robust and a genuine effect.

05:50
But if you're interested in self-knowledge, as I am, the more interesting bit is: OK, so what did they say when they explained these choices?

05:58
So we've done a lot of analysis of the verbal reports in these experiments. And this graph simply shows that if you compare what they say in a manipulated trial with a nonmanipulated trial, that is, when they explain a normal choice they've made and one where we manipulated the outcome, we find that they are remarkably similar. So they are just as emotional, just as specific, and they are expressed with the same level of certainty.

06:27
So the strong conclusion to draw from this is that if there are no differences between a real choice and a manipulated choice, perhaps we make things up all the time.

06:38
But we've also done studies where we try to match what they say with the actual faces. And then we find things like this. So here, this male participant, he preferred the girl to the left, he ended up with the one to the right. And then, he explained his choice like this: "She is radiant. I would rather have approached her at the bar than the other one. And I like earrings." And whatever made him choose the girl on the left to begin with, it can't have been the earrings, because they were actually sitting on the girl on the right. So this is a clear example of a post hoc construction. So they just explained the choice afterwards.

07:17
So what this experiment shows is, OK, so if we fail to detect that our choices have been changed, we will immediately start to explain them in another way. And what we also found is that the participants often come to prefer the alternative that they were led to believe they liked. So if we let them make the choice again, they will now choose the face they had previously rejected.

07:41
So this is the effect we call "choice blindness." And we've done a number of different studies -- we've tried consumer choices, choices based on taste and smell, and even reasoning problems. But what you all want to know is, of course: Does this extend also to more complex, more meaningful choices, like those concerning moral and political issues?

08:04
So the next experiment, it needs a little bit of background. So in Sweden, the political landscape is dominated by a left-wing and a right-wing coalition. And the voters may move a little bit between the parties within each coalition, but there is very little movement between the coalitions. And before each election, the newspapers and the polling institutes put together what they call "an election compass," which consists of a number of dividing issues that sort of separate the two coalitions. Things like whether the tax on gasoline should be increased, or whether the 13 months of paid parental leave should be split equally between the two parents in order to increase gender equality.

08:54
So, before the last Swedish election, we created an election compass of our own. So we walked up to people in the street and asked if they wanted to do a quick political survey. So first we had them state their voting intention between the two coalitions. Then we asked them to answer 12 of these questions. They would fill in their answers, and we would ask them to discuss: So OK, why do you think tax on gas should be increased? And we'd go through the questions. Then we had a color-coded template that would allow us to tally their overall score. So this person would have one, two, three, four, five, six, seven, eight, nine scores to the left, so he would lean to the left, basically. And in the end, we also had them fill in their voting intention once more.

09:48
But of course, there was also a trick involved. So first, we walked up to people, we asked them about their voting intention, and then when they started filling in, we would fill in a set of answers going in the opposite direction. We would put it under the notepad. And when we got the questionnaire, we would simply glue it on top of the participant's own answers. So there, it's gone.

10:24
And then we would ask about each of the questions: How did you reason here? And they'd state their reasons, and together we would sum up their overall score. And in the end, they would state their voting intention again.

10:41
So what we find first of all here is that very few of these manipulations are detected. And they're not detected in the sense that they realize, "OK, you must have changed my answer"; it was more the case that, "OK, I must've misunderstood the question the first time I read it. Can I please change it?"

10:59
And even if a few of these manipulated answers were changed back, the overall majority was missed. So we managed to switch 90 percent of the participants' answers from left to right, right to left, their overall profile.

11:14
And what happens then when they are asked to motivate their choices? And here we find much more interesting verbal reports than we did with the faces. People say things like this, and I'll read it to you.

11:29
So: "Large-scale governmental surveillance of email and internet traffic ought to be permissible as a means to combat international crime and terrorism." "So you agree to some extent with this statement." "Yes." "So how did you reason here?" "Well, like, as it is so hard to get at international crime and terrorism, I think there should be those kinds of tools." And then the person remembers an argument from the newspaper that morning: "Like in the newspaper today, it said they can, like, listen to mobile phones from prison, if a gang leader tries to continue his crimes from inside. And I think it's madness that we have so little power that we can't stop those things when we actually have the possibility to do so." And then there's a little bit of back and forth at the end: "I don't like that they have access to everything I do, but I still think it's worth it in the long run."

12:19
So, if you didn't know that this person had just taken part in a choice blindness experiment, I don't think you would question that this is the true attitude of that person.

12:29
And what happens in the end, with the voting intention? What we find -- that one is also clearly affected by the questionnaire. So we have 10 participants shifting from left to right or from right to left. We have another 19 that go from clear voting intention to being uncertain. Some go from being uncertain to clear voting intention. And then there are a number of participants who stay uncertain throughout.

12:54
And that number is interesting, because if you look at what the polling institutes say, the closer you get to an election, the only people that are sort of in play are the ones that are considered uncertain. But we show there is a much larger number that would actually consider shifting their attitudes.

13:13
And here I must point out, of course, that you are not allowed to use this as an actual method to change people's votes before an election, and we clearly debriefed them afterwards and gave them every opportunity to change back to whatever they thought first. But what this shows is that if you can get people to see the opposite view and engage in a conversation with themselves, that could actually make them change their views.

13:42
OK. So what does it all mean? What do I think is going on here? So first of all, a lot of what we call self-knowledge is actually self-interpretation. So I see myself make a choice, and then when I'm asked why, I just try to make as much sense of it as possible when I make an explanation. But we do this so quickly and with such ease that we think we actually know the answer when we answer why. And as it is an interpretation, of course we sometimes make mistakes. The same way we make mistakes when we try to understand other people.

14:23
So beware when you ask people the question "why," because what may happen is that, if you ask them, "So why do you support this issue?" or "Why do you stay in this job or this relationship?" -- what may happen when you ask why is that you actually create an attitude that wasn't there before you asked the question.

14:45
And this is of course important in your professional life as well, or it could be. If, say, you design something and then you ask people, "Why do you think this is good or bad?" Or if you're a journalist asking a politician, "So, why did you make this decision?" Or if indeed you are a politician and try to explain why a certain decision was made.

15:06
So this may all seem a bit disturbing. But if you want to look at it from a positive direction, it could be seen as showing, OK, so we're actually a little bit more flexible than we think. We can change our minds. Our attitudes are not set in stone. And we can also change the minds of others, if we can only get them to engage with the issue and see it from the opposite view.

15:31
And in my own personal life, since starting with this research -- so my partner and I, we've always had the rule that you're allowed to take things back. Just because I said I liked something a year ago doesn't mean I have to like it still. And getting rid of the need to stay consistent is actually a huge relief and makes relational life so much easier to live.

15:53
Anyway, so the conclusion must be: know that you don't know yourself. Or at least not as well as you think you do.

16:03
Thanks.

16:04
(Applause)