How we read each other's minds | Rebecca Saxe

560,652 views ・ 2009-09-11

TED


00:12
Today I'm going to talk to you about the problem of other minds. And the problem I'm going to talk about is not the familiar one from philosophy, which is, "How can we know whether other people have minds?" That is, maybe you have a mind, and everyone else is just a really convincing robot. So that's a problem in philosophy, but for today's purposes I'm going to assume that many people in this audience have a mind, and that I don't have to worry about this.

00:37
There is a second problem that is maybe even more familiar to us as parents and teachers and spouses and novelists, which is, "Why is it so hard to know what somebody else wants or believes?" Or perhaps, more relevantly, "Why is it so hard to change what somebody else wants or believes?" I think novelists put this best. Like Philip Roth, who said, "And yet, what are we to do about this terribly significant business of other people? So ill equipped are we all, to envision one another's interior workings and invisible aims." So as a teacher and as a spouse, this is, of course, a problem I confront every day. But as a scientist, I'm interested in a different problem of other minds, and that is the one I'm going to introduce to you today. And that problem is, "How is it so easy to know other minds?"

01:24
So to start with an illustration, you need almost no information, one snapshot of a stranger, to guess what this woman is thinking, or what this man is. And put another way, the crux of the problem is the machine that we use for thinking about other minds, our brain, is made up of pieces, brain cells, that we share with all other animals, with monkeys and mice and even sea slugs. And yet, you put them together in a particular network, and what you get is the capacity to write Romeo and Juliet. Or to say, as Alan Greenspan did, "I know you think you understand what you thought I said, but I'm not sure you realize that what you heard is not what I meant."
(Laughter)

02:06
So, the job of my field of cognitive neuroscience is to stand with these ideas, one in each hand. And to try to understand how you can put together simple units, simple messages over space and time, in a network, and get this amazing human capacity to think about minds.

02:23
So I'm going to tell you three things about this today. Obviously the whole project here is huge. And I'm going to tell you just our first few steps about the discovery of a special brain region for thinking about other people's thoughts. Some observations on the slow development of this system as we learn how to do this difficult job. And then finally, to show that some of the differences between people, in how we judge others, can be explained by differences in this brain system.

02:51
So first, the first thing I want to tell you is that there is a brain region in the human brain, in your brains, whose job it is to think about other people's thoughts. This is a picture of it. It's called the Right Temporo-Parietal Junction. It's above and behind your right ear. And this is the brain region you used when you saw the pictures I showed you, or when you read Romeo and Juliet or when you tried to understand Alan Greenspan. And you don't use it for solving any other kinds of logical problems. So this brain region is called the Right TPJ. And this picture shows the average activation in a group of what we call typical human adults. They're MIT undergraduates.
(Laughter)

03:29
The second thing I want to say about this brain system is that although we human adults are really good at understanding other minds, we weren't always that way. It takes children a long time to break into the system. I'm going to show you a little bit of that long, extended process. The first thing I'm going to show you is a change between age three and five, as kids learn to understand that somebody else can have beliefs that are different from their own. So I'm going to show you a five-year-old who is getting a standard kind of puzzle that we call the false belief task.

03:59
Rebecca Saxe (Video): This is the first pirate. His name is Ivan. And you know what pirates really like?
Child: What?
RS: Pirates really like cheese sandwiches.
Child: Cheese? I love cheese!
RS: Yeah. So Ivan has this cheese sandwich, and he says, "Yum yum yum yum yum! I really love cheese sandwiches." And Ivan puts his sandwich over here, on top of the pirate chest. And Ivan says, "You know what? I need a drink with my lunch." And so Ivan goes to get a drink. And while Ivan is away the wind comes, and it blows the sandwich down onto the grass. And now, here comes the other pirate. This pirate is called Joshua. And Joshua also really loves cheese sandwiches. So Joshua has a cheese sandwich and he says, "Yum yum yum yum yum! I love cheese sandwiches." And he puts his cheese sandwich over here on top of the pirate chest.
Child: So, that one is his.
RS: That one is Joshua's. That's right.
Child: And then his went on the ground.
RS: That's exactly right.
Child: So he won't know which one is his.
RS: Oh. So now Joshua goes off to get a drink. Ivan comes back and he says, "I want my cheese sandwich." So which one do you think Ivan is going to take?
Child: I think he is going to take that one.
RS: Yeah, you think he's going to take that one? All right. Let's see. Oh yeah, you were right. He took that one.

05:19
So that's a five-year-old who clearly understands that other people can have false beliefs and what the consequences are for their actions. Now I'm going to show you a three-year-old who got the same puzzle.

05:30
RS (Video): And Ivan says, "I want my cheese sandwich." Which sandwich is he going to take? Do you think he's going to take that one? Let's see what happens. Let's see what he does. Here comes Ivan. And he says, "I want my cheese sandwich." And he takes this one. Uh-oh. Why did he take that one?
Child: His was on the grass.

05:51
So the three-year-old does two things differently. First, he predicts Ivan will take the sandwich that's really his. And second, when he sees Ivan taking the sandwich where he left his, where we would say he's taking that one because he thinks it's his, the three-year-old comes up with another explanation: He's not taking his own sandwich because he doesn't want it, because now it's dirty, on the ground. So that's why he's taking the other sandwich.

06:15
Now of course, development doesn't end at five. And we can see the continuation of this process of learning to think about other people's thoughts by upping the ante and asking children now, not for an action prediction, but for a moral judgment. So first I'm going to show you the three-year-old again.

06:32
RS (Video): So is Ivan being mean and naughty for taking Joshua's sandwich?
Child: Yeah.
RS: Should Ivan get in trouble for taking Joshua's sandwich?
Child: Yeah.

06:41
So it's maybe not surprising he thinks it was mean of Ivan to take Joshua's sandwich, since he thinks Ivan only took Joshua's sandwich to avoid having to eat his own dirty sandwich. But now I'm going to show you the five-year-old. Remember the five-year-old completely understood why Ivan took Joshua's sandwich.

06:56
RS (Video): Was Ivan being mean and naughty for taking Joshua's sandwich?
Child: Um, yeah.

07:02
And so, it is not until age seven that we get what looks more like an adult response.

07:07
RS (Video): Should Ivan get in trouble for taking Joshua's sandwich?
Child: No, because the wind should get in trouble.

07:12
He says the wind should get in trouble for switching the sandwiches.
(Laughter)

07:19
And now what we've started to do in my lab is to put children into the brain scanner and ask what's going on in their brain as they develop this ability to think about other people's thoughts. So the first thing is that in children we see this same brain region, the Right TPJ, being used while children are thinking about other people. But it's not quite like the adult brain. So whereas in the adults, as I told you, this brain region is almost completely specialized -- it does almost nothing else except for thinking about other people's thoughts -- in children it's much less so, when they are age five to eight, the age range of the children I just showed you. And actually if we even look at eight to 11-year-olds, getting into early adolescence, they still don't have quite an adult-like brain region. And so, what we can see is that over the course of childhood and even into adolescence, both the cognitive system, our mind's ability to think about other minds, and the brain system that supports it are continuing, slowly, to develop.

08:14
But of course, as you're probably aware, even in adulthood, people differ from one another in how good they are at thinking of other minds, how often they do it and how accurately. And so what we wanted to know was, could differences among adults in how they think about other people's thoughts be explained in terms of differences in this brain region? So, the first thing that we did is we gave adults a version of the pirate problem that we gave to the kids. And I'm going to give that to you now.

08:39
So Grace and her friend are on a tour of a chemical factory, and they take a break for coffee. And Grace's friend asks for some sugar in her coffee. Grace goes to make the coffee and finds by the coffee a pot containing a white powder, which is sugar. But the powder is labeled "Deadly Poison," so Grace thinks that the powder is a deadly poison. And she puts it in her friend's coffee. And her friend drinks the coffee, and is fine. How many people think it was morally permissible for Grace to put the powder in the coffee? Okay. Good. (Laughter)

09:15
So we ask people, how much should Grace be blamed in this case, which we call a failed attempt to harm? And we can compare that to another case, where everything in the real world is the same. The powder is still sugar, but what's different is what Grace thinks. Now she thinks the powder is sugar. And perhaps unsurprisingly, if Grace thinks the powder is sugar and puts it in her friend's coffee, people say she deserves no blame at all. Whereas if she thinks the powder was poison, even though it's really sugar, now people say she deserves a lot of blame, even though what happened in the real world was exactly the same.

And in fact, they say she deserves more blame in this case, the failed attempt to harm, than in another case, which we call an accident. Where Grace thought the powder was sugar, because it was labeled "sugar" and by the coffee machine, but actually the powder was poison. So even though when the powder was poison, the friend drank the coffee and died, people say Grace deserves less blame in that case, when she innocently thought it was sugar, than in the other case, where she thought it was poison and no harm occurred.

10:17
People, though, disagree a little bit about exactly how much blame Grace should get in the accident case. Some people think she should deserve more blame, and other people less. And what I'm going to show you is what happened when we look inside the brains of people while they're making that judgment. So what I'm showing you, from left to right, is how much activity there was in this brain region, and from top to bottom, how much blame people said that Grace deserved. And what you can see is, on the left when there was very little activity in this brain region, people paid little attention to her innocent belief and said she deserved a lot of blame for the accident. Whereas on the right, where there was a lot of activity, people paid a lot more attention to her innocent belief, and said she deserved a lot less blame for causing the accident.

10:59
So that's good, but of course what we'd rather is have a way to interfere with function in this brain region, and see if we could change people's moral judgment. And we do have such a tool. It's called Trans-Cranial Magnetic Stimulation, or TMS. This is a tool that lets us pass a magnetic pulse through somebody's skull, into a small region of their brain, and temporarily disorganize the function of the neurons in that region. So I'm going to show you a demo of this.

First, I'm going to show you that this is a magnetic pulse. I'm going to show you what happens when you put a quarter on the machine. When you hear clicks, we're turning the machine on. So now I'm going to apply that same pulse to my brain, to the part of my brain that controls my hand. So there is no physical force, just a magnetic pulse.

11:54
Woman (Video): Ready, Rebecca? RS: Yes.
Okay, so it causes a small involuntary contraction in my hand by putting a magnetic pulse in my brain. And we can use that same pulse, now applied to the RTPJ, to ask if we can change people's moral judgments. So these are the judgments I showed you before, people's normal moral judgments. And then we can apply TMS to the RTPJ and ask how people's judgments change. And the first thing is, people can still do this task overall. So their judgments of the case when everything was fine remain the same. They say she deserves no blame. But in the case of a failed attempt to harm, where Grace thought that it was poison, although it was really sugar, people now say it was more okay, she deserves less blame for putting the powder in the coffee. And in the case of the accident, where she thought that it was sugar, but it was really poison and so she caused a death, people say that it was less okay, she deserves more blame.

12:50
So what I've told you today is that people come, actually, especially well equipped to think about other people's thoughts. We have a special brain system that lets us think about what other people are thinking. This system takes a long time to develop, slowly throughout the course of childhood and into early adolescence. And even in adulthood, differences in this brain region can explain differences among adults in how we think about and judge other people.

But I want to give the last word back to the novelists, and to Philip Roth, who ended by saying, "The fact remains that getting people right is not what living is all about anyway. It's getting them wrong that is living. Getting them wrong and wrong and wrong, and then on careful reconsideration, getting them wrong again."

Thank you.
(Applause)

13:47
Chris Anderson: So, I have a question. When you start talking about using magnetic pulses to change people's moral judgments, that sounds alarming. (Laughter) Please tell me that you're not taking phone calls from the Pentagon, say.

14:00
RS: I'm not. I mean, they're calling, but I'm not taking the call. (Laughter)

14:06
CA: They really are calling? So then seriously, you must lie awake at night sometimes wondering where this work leads. I mean, you're clearly an incredible human being, but someone could take this knowledge and in some future not-torture chamber, do acts that people here might be worried about.

14:28
RS: Yeah, we worry about this. So, there's a couple of things to say about TMS. One is that you can't be TMSed without knowing it. So it's not a surreptitious technology. It's quite hard, actually, to get those very small changes. The changes I showed you are impressive to me because of what they tell us about the function of the brain, but they're small on the scale of the moral judgments that we actually make. And what we changed was not people's moral judgments when they're deciding what to do, when they're making action choices. We changed their ability to judge other people's actions. And so, I think of what I'm doing not so much as studying the defendant in a criminal trial, but studying the jury.

15:06
CA: Is your work going to lead to any recommendations in education, to perhaps bring up a generation of kids able to make fairer moral judgments?

15:17
RS: That's one of the idealistic hopes. The whole research program here of studying the distinctive parts of the human brain is brand new. Until recently, what we knew about the brain were the things that any other animal's brain could do too, so we could study it in animal models. We knew how brains see, and how they control the body and how they hear and sense. And the whole project of understanding how brains do the uniquely human things -- learn language and abstract concepts, and thinking about other people's thoughts -- that's brand new. And we don't know yet what the implications will be of understanding it.

15:53
CA: So I've got one last question. There is this thing called the hard problem of consciousness, that puzzles a lot of people. The notion that you can understand why a brain works, perhaps. But why does anyone have to feel anything? Why does it seem to require these beings who sense things for us to operate? You're a brilliant young neuroscientist. I mean, what chances do you think there are that at some time in your career, someone, you or someone else, is going to come up with some paradigm shift in understanding what seems an impossible problem?

16:27
RS: I hope they do. And I think they probably won't.

16:31
CA: Why?

16:34
RS: It's not called the hard problem of consciousness for nothing. (Laughter)

16:39
CA: That's a great answer. Rebecca Saxe, thank you very much. That was fantastic. (Applause)